Dataset schema (one row per file): hexsha, size, ext, lang; max-stars / max-issues / max-forks repo metadata (path, name, head hexsha, licenses, counts, event datetimes); content; line-length and alphanumeric statistics; qsc_* code-quality signals (n-gram duplication fractions, character-class fractions, and Python-specific AST/import/print signals); effective; hits.
045d833a3d41ca735b37c661f969bbfde50f661d | 16,489 bytes | Python | examples/temp_bubbles/plot_sweep_results.py | JBHilton/hh-npi-modelling @ 8a51e6260bdc700bfa7995ff4b0885393da4d360 | Apache-2.0 | 2 stars (2022-01-17)

'''This plots the bubble results
'''
from os import makedirs
from math import ceil, floor
from pickle import load
from numpy import arange
from matplotlib.pyplot import close, subplots

# create the output directory (and any missing parents) if it does not exist
makedirs('plots/temp_bubbles', exist_ok=True)
with open('outputs/temp_bubbles/baseline_results.pkl', 'rb') as f:
(baseline_peak_data,
baseline_end_data,
baseline_ar_data,
baseline_hh_prop_data) = load(f)
with open('outputs/temp_bubbles/results.pkl', 'rb') as f:
(peak_data0,
end_data0,
ar_data0,
hh_prop_data0,
peak_data1,
end_data1,
ar_data1,
hh_prop_data1,
peak_data2,
end_data2,
ar_data2,
hh_prop_data2,
peak_data3,
end_data3,
ar_data3,
hh_prop_data3,
peak_data4,
end_data4,
ar_data4,
hh_prop_data4,
unmerged_exponents,
merged_exponents) = load(f)
baseline_peak_array = 0 * end_data0
for i in range(merged_exponents.shape[1]):
baseline_peak_array[:,i] = baseline_peak_data
baseline_end_array = 0 * end_data0
for i in range(merged_exponents.shape[1]):
baseline_end_array[:,i] = baseline_end_data
fig, ((ax_0, ax_1),
(ax_2, ax_3),
(ax_4, ax_5)) = subplots(3, 2, sharex=True, sharey=True, figsize=(6,8))
vmin_bl = 0.5*floor(2*100*baseline_peak_array.min())
vmax_bl = 0.5*ceil(2*100*baseline_peak_array.max())
vmin_0 = 0.5*floor(2*100*peak_data0.min())
vmax_0 = 0.5*ceil(2*100*peak_data0.max())
vmin_2 = 0.5*floor(2*100*peak_data2.min())
vmax_2 = 0.5*ceil(2*100*peak_data2.max())
vmin_1 = 0.5*floor(2*100*peak_data1.min())
vmax_1 = 0.5*ceil(2*100*peak_data1.max())
vmin_4 = 0.5*floor(2*100*peak_data4.min())
vmax_4 = 0.5*ceil(2*100*peak_data4.max())
vmin_3 = 0.5*floor(2*100*peak_data3.min())
vmax_3 = 0.5*ceil(2*100*peak_data3.max())
vmin = min(vmin_bl, vmin_0, vmin_1, vmin_2, vmin_3, vmin_4)
vmax = max(vmax_bl, vmax_0, vmax_1, vmax_2, vmax_3, vmax_4)
vtick = arange(vmin, vmax+0.5, 0.5)
axim = ax_0.imshow(100 * baseline_peak_array,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_0.set_ylabel('Single household\n density exponent', fontsize=12)
ax_0.text(-0.5, 1, 'a)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_0.spines['top'].set_visible(False)
ax_0.spines['right'].set_visible(False)
axim = ax_1.imshow(100 * peak_data0,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_1.text(-0.2, 1, 'b)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_1.spines['top'].set_visible(False)
ax_1.spines['right'].set_visible(False)
axim = ax_2.imshow(100 * peak_data2,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_2.set_ylabel('Single household\n density exponent', fontsize=12)
ax_2.text(-0.5, 1, 'c)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_2.spines['top'].set_visible(False)
ax_2.spines['right'].set_visible(False)
axim = ax_3.imshow(100 * peak_data1,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_3.text(-0.2, 1, 'd)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_3.spines['top'].set_visible(False)
ax_3.spines['right'].set_visible(False)
axim = ax_4.imshow(100 * peak_data4,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_4.set_ylabel('Single household\n density exponent', fontsize=12)
ax_4.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_4.text(-0.5, 1, 'e)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_4.spines['top'].set_visible(False)
ax_4.spines['right'].set_visible(False)
axim = ax_5.imshow(100 * peak_data3,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_5.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_5.text(-0.2, 1, 'f)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_5.spines['top'].set_visible(False)
ax_5.spines['right'].set_visible(False)
cax = fig.add_axes([0.95, 0.15, 0.05, 0.7])
cbar = fig.colorbar(axim,
cax=cax,
orientation='vertical',
ticks=vtick)
cbar.set_label("Peak %\n prevalence", fontsize=12)
cbar.outline.set_visible(False)
axim = ax_0.contour(100 * baseline_peak_array,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_0.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_0.set_ylabel('Single household\n density exponent', fontsize=12)
ax_0.text(-0.5, 1, 'a)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_0.spines['top'].set_visible(False)
ax_0.spines['right'].set_visible(False)
axim = ax_1.contour(100 * peak_data0,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_1.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_1.text(-0.2, 1, 'b)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_1.spines['top'].set_visible(False)
ax_1.spines['right'].set_visible(False)
axim = ax_2.contour(100 * peak_data2,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_2.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_2.set_ylabel('Single household\n density exponent', fontsize=12)
ax_2.text(-0.5, 1, 'c)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_2.spines['top'].set_visible(False)
ax_2.spines['right'].set_visible(False)
axim = ax_3.contour(100 * peak_data1,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_3.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_3.text(-0.2, 1, 'd)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_3.spines['top'].set_visible(False)
ax_3.spines['right'].set_visible(False)
axim = ax_4.contour(100 * peak_data4,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_4.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_4.set_ylabel('Single household\n density exponent', fontsize=12)
ax_4.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_4.text(-0.5, 1, 'e)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_4.spines['top'].set_visible(False)
ax_4.spines['right'].set_visible(False)
axim = ax_5.contour(100 * peak_data3,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_5.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_5.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_5.text(-0.2, 1, 'f)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_5.spines['top'].set_visible(False)
ax_5.spines['right'].set_visible(False)
fig.savefig('plots/temp_bubbles/peak_prev_grid.png',bbox_inches='tight', dpi=300)
close()
fig, ((ax_0, ax_1),
(ax_2, ax_3),
(ax_4, ax_5)) = subplots(3, 2, sharex=True, sharey=True, figsize=(6,8))
vmin_bl = floor(100*baseline_end_array.min())
vmax_bl = ceil(100*baseline_end_array.max())
vmin_0 = floor(100*end_data0.min())
vmax_0 = ceil(100*end_data0.max())
vmin_2 = floor(100*end_data2.min())
vmax_2 = ceil(100*end_data2.max())
vmin_1 = floor(100*end_data1.min())
vmax_1 = ceil(100*end_data1.max())
vmin_4 = floor(100*end_data4.min())
vmax_4 = ceil(100*end_data4.max())
vmin_3 = floor(100*end_data3.min())
vmax_3 = ceil(100*end_data3.max())
vmin = min(vmin_bl, vmin_0, vmin_1, vmin_2, vmin_3, vmin_4)
vmax = max(vmax_bl, vmax_0, vmax_1, vmax_2, vmax_3, vmax_4)
vtick = arange(vmin, vmax+1.0, 1.0)
axim = ax_0.imshow(100 * baseline_end_array,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_0.set_ylabel('Single household\n density exponent', fontsize=12)
ax_0.text(-0.5, 1, 'a)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_0.spines['top'].set_visible(False)
ax_0.spines['right'].set_visible(False)
axim = ax_1.imshow(100 * end_data0,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_1.text(-0.2, 1, 'b)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_1.spines['top'].set_visible(False)
ax_1.spines['right'].set_visible(False)
axim = ax_2.imshow(100 * end_data2,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_2.set_ylabel('Single household\n density exponent', fontsize=12)
ax_2.text(-0.5, 1, 'c)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_2.spines['top'].set_visible(False)
ax_2.spines['right'].set_visible(False)
axim = ax_3.imshow(100 * end_data1,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_3.text(-0.2, 1, 'd)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_3.spines['top'].set_visible(False)
ax_3.spines['right'].set_visible(False)
axim = ax_4.imshow(100 * end_data4,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_4.set_ylabel('Single household\n density exponent', fontsize=12)
ax_4.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_4.text(-0.5, 1, 'e)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_4.spines['top'].set_visible(False)
ax_4.spines['right'].set_visible(False)
axim = ax_5.imshow(100 * end_data3,
origin='lower',
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1),
aspect=1)
ax_5.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_5.text(-0.2, 1, 'f)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_5.spines['top'].set_visible(False)
ax_5.spines['right'].set_visible(False)
cax = fig.add_axes([0.95, 0.15, 0.05, 0.7])
cbar = fig.colorbar(axim,
cax=cax,
orientation='vertical',
ticks=vtick)
cbar.set_label("Cumulative %\n prevalence", fontsize=12)
cbar.outline.set_visible(False)
axim = ax_0.contour(100 * baseline_end_array,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_0.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_0.set_ylabel('Single household\n density exponent', fontsize=12)
ax_0.text(-0.5, 1, 'a)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_0.spines['top'].set_visible(False)
ax_0.spines['right'].set_visible(False)
axim = ax_1.contour(100 * end_data0,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_1.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_1.text(-0.2, 1, 'b)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_1.spines['top'].set_visible(False)
ax_1.spines['right'].set_visible(False)
axim = ax_2.contour(100 * end_data2,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_2.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_2.set_ylabel('Single household\n density exponent', fontsize=12)
ax_2.text(-0.5, 1, 'c)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_2.spines['top'].set_visible(False)
ax_2.spines['right'].set_visible(False)
axim = ax_3.contour(100 * end_data1,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_3.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_3.text(-0.2, 1, 'd)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_3.spines['top'].set_visible(False)
ax_3.spines['right'].set_visible(False)
axim = ax_4.contour(100 * end_data4,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_4.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_4.set_ylabel('Single household\n density exponent', fontsize=12)
ax_4.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_4.text(-0.5, 1, 'e)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_4.spines['top'].set_visible(False)
ax_4.spines['right'].set_visible(False)
axim = ax_5.contour(100 * end_data3,
colors='w',
levels=vtick,
vmin=vmin,
vmax=vmax,
extent=(0, 1, 0, 1))
ax_5.clabel(axim, fontsize=9, inline=1, fmt='%1.1f')
ax_5.set_xlabel('Bubbled density\n exponent', fontsize=12)
ax_5.text(-0.2, 1, 'f)',
fontsize=12,
verticalalignment='top',
fontfamily='serif',
bbox=dict(facecolor='1', edgecolor='none', pad=3.0))
ax_5.spines['top'].set_visible(False)
ax_5.spines['right'].set_visible(False)
fig.savefig('plots/temp_bubbles/cum_prev_grid.png',bbox_inches='tight', dpi=300)
close()
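The twelve imshow/contour panel blocks above repeat the same commands with different data arrays. A loop-based helper could replace the repetition; this is a self-contained sketch (the helper name, the demo grids, and the panel letters are illustrative, not part of the original script):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend so the sketch runs without a display
from string import ascii_lowercase
from matplotlib.pyplot import subplots

def plot_panels(axs, datasets, vmin, vmax, vtick):
    """Draw one heatmap plus white labelled contours per dataset (sketch)."""
    for k, (ax, data) in enumerate(zip(axs.ravel(), datasets)):
        axim = ax.imshow(100 * data, origin='lower', vmin=vmin, vmax=vmax,
                         extent=(0, 1, 0, 1), aspect=1)
        cs = ax.contour(100 * data, colors='w', levels=vtick,
                        vmin=vmin, vmax=vmax, extent=(0, 1, 0, 1))
        ax.clabel(cs, fontsize=9, inline=True, fmt='%1.1f')
        ax.text(-0.2, 1, ascii_lowercase[k] + ')', fontsize=12,
                verticalalignment='top', fontfamily='serif')
        for side in ('top', 'right'):
            ax.spines[side].set_visible(False)
    return axim  # last image handle, usable for the shared colorbar

fig, axs = subplots(3, 2, sharex=True, sharey=True, figsize=(6, 8))
grids = [np.linspace(0, 1, 25).reshape(5, 5) for _ in range(6)]
last_image = plot_panels(axs, grids, 0, 100, np.arange(0, 101, 25))
```

The returned image handle plays the same role as `axim` in the original: it is the mappable passed to `fig.colorbar`.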
f0abab9d4dce346c3bf53097534446ca9522245b | 344 bytes | Python | tests/test_smoke.py | sandorex/extract-browser-data.py @ f2b2a60b9af2f7331e09167abf0b8c6cd6e534d0 | Apache-2.0 | 7 issues (2020-04-05 to 2020-04-15)

import pytest
def test_ff_smoke():
pass
@pytest.mark.reading
def test_ff_reading_smoke():
pass
@pytest.mark.writing
def test_ff_writing_smoke():
pass
def test_ch_smoke():
pass
@pytest.mark.reading
def test_ch_reading_smoke():
pass
@pytest.mark.writing
def test_ch_writing_smoke():
pass
def test_smoke():
pass
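pytest treats the `reading` and `writing` marks above as custom markers; to avoid `PytestUnknownMarkWarning` they would normally be registered in the project's configuration. A sketch of such a file (its contents are assumed, not shown in the source) is:

```ini
# pytest.ini
[pytest]
markers =
    reading: tests that read browser data
    writing: tests that write browser data
```

Subsets can then be selected with `pytest -m reading` or excluded with `pytest -m "not writing"`.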
9bce6a2b4e6803c11e5214febb8b6f6ef8ea4050 | 762 bytes | Python | Exercices/20200425 Turtle Polygone.py | FredC94/MOOC-Python3 @ bb8d15fabc5c0cb240198f968aed4be0d3e0900b | MIT

import turtle
turtle.up()
turtle.shape('turtle')
turtle.goto(0,0)
turtle.color('black')
turtle.down()
turtle.begin_fill()
turtle.forward(100)
turtle.right(120)
turtle.forward(100)
turtle.right(60)
turtle.forward(100)
turtle.right(120)
turtle.forward(100)
turtle.end_fill()
turtle.up()
turtle.shape('turtle')
turtle.goto(100,0)
turtle.color('blue')
turtle.down()
turtle.begin_fill()
turtle.left(60)
turtle.forward(100)
turtle.left(60)
turtle.forward(100)
turtle.left(120)
turtle.forward(100)
turtle.end_fill()
turtle.up()
turtle.shape('turtle')
turtle.color('red')
turtle.down()
turtle.begin_fill()
turtle.left(180)
turtle.forward(100)
turtle.left(120)
turtle.forward(100)
turtle.left(60)
turtle.forward(100)
turtle.end_fill()
turtle.hideturtle()
turtle.done()

ac9e473f3a6bd342f89ff560c12530fd595cd59b | 20,398 bytes | Python | PetriDish.py | AkandaAshraf/VirtualSoc @ b2e78b9816d9a30e321e3e8ccd2b4cc78eddbff3 | MIT | 16 stars (2019-05-23 to 2022-02-04)

from Node import *
from DNA import *
import Networks
import numpy as np
# All of the methods in this module generate nodes.
def createSimpleNodes( numberOfNodes, nodeType, DNA, Graph):
'''
Generate the simplest type of node, with no features attached.
:param numberOfNodes: total number of nodes to generate (int)
:param nodeType: class of node to instantiate (e.g. Node); pass the class itself, not a string
:param DNA: the DNA object shared by the generated nodes
:param Graph: Graph object; all generated nodes must belong to a graph to enable socialisation
:return: list of Node objects
'''
return [nodeType(DNA, Graph=Graph) for _ in range(numberOfNodes)]
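The list comprehension above is a simple factory pattern: one class, one shared argument set, N independent instances. A generic, self-contained sketch (the names `create_nodes` and `Dummy` are illustrative, not part of this codebase) is:

```python
def create_nodes(numberOfNodes, nodeType, *args, **kwargs):
    # Build numberOfNodes independent instances of nodeType,
    # forwarding any constructor arguments unchanged.
    return [nodeType(*args, **kwargs) for _ in range(numberOfNodes)]

class Dummy:
    def __init__(self, dna, Graph=None):
        self.dna = dna
        self.Graph = Graph

nodes = create_nodes(3, Dummy, 'dna-string', Graph='g')
```

Each element is a distinct object, but all share the same constructor arguments, mirroring how every node produced here shares one DNA object and one Graph.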
def createSocialNodesThreeFeatures( numberOfNodes, nodeType, DNA,commonLabel, Graph):
'''
Generate NodeSocial-type nodes with three basic auto-generated features (age, gender, location).
:param numberOfNodes: total number of nodes to generate (int)
:param nodeType: class of node to instantiate; must be NodeSocial or a more advanced subclass of Node, but not Node itself. Pass the class itself, not a string.
:param DNA: the DNA object
:param commonLabel: label of the generated nodes. Every node refers to a single DNA object, and that DNA object defines the node's label. Here the label is not auto-generated and must be passed in this parameter.
:param Graph: Graph object; all generated nodes must belong to a graph to enable socialisation
:return: list of NodeSocial (or more advanced) objects
'''
age = np.random.randint(18, high=65, size=numberOfNodes)
gender = np.random.randint(0, high=2, size=numberOfNodes)
location = np.random.randint(1, high=100, size=numberOfNodes)
return [nodeType(age=age[i], gender=gender[i], location=location[i], label=commonLabel, DNA=DNA, Graph=Graph) for i in range(numberOfNodes)]
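The three `np.random.randint` calls above draw one vectorised sample per node. An equivalent, self-contained sketch using the modern `Generator` API (an assumption for illustration; the module itself uses the legacy `np.random` functions) is:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
numberOfNodes = 5
age = rng.integers(18, 65, size=numberOfNodes)       # 18 <= age < 65
gender = rng.integers(0, 2, size=numberOfNodes)      # 0 or 1
location = rng.integers(1, 100, size=numberOfNodes)  # 1 <= location < 100
```

As with `np.random.randint`, the upper bound is exclusive, so `high=65` can never produce an age of 65.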
def createSocialNodesNFeatures(numberOfNodes, nodeType, DNA, commonLabel, Graph, additionalFeatureLen, addTraidtionalFeatures, npDistFunc):
'''
:param numberOfNodes: total number of nodes to generate (int)
:param nodeType: class of node to instantiate; must be NodeSocial or a more advanced subclass of Node, but not Node itself. Pass the class itself, not a string.
:param DNA: DNA object
:param commonLabel: label of the generated nodes. Every node refers to a single DNA object, and that DNA object defines the node's label. Here the label is not auto-generated and must be passed in this parameter.
:param Graph: Graph object; all generated nodes must belong to a graph to enable socialisation
:param additionalFeatureLen: number of additional features besides age, gender, and location (int)
:param addTraidtionalFeatures: whether to add the age, gender, and location features by default (True/False)
:param npDistFunc: distribution of the additional features, usually a numpy call given as a string (include the package name, i.e. numpy/np) without a size argument, since the size is injected by this function. A user-written function may be used instead, provided it accepts an integer size parameter giving the number of random numbers to draw.
:return: list of NodeSocial (or more advanced) objects
'''
featuresAgeGenderLocation = []
features = []
if addTraidtionalFeatures:
featuresAgeGenderLocation.append(np.random.randint(18, high=80, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(0, high=2, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(1, high=100, size=numberOfNodes))
if npDistFunc is not None:
if not isinstance(npDistFunc, list):
for i in range(0, additionalFeatureLen):
l = len(npDistFunc)
# splice ',size=numberOfNodes' in before the closing parenthesis of the call string
features.append(eval(npDistFunc[:l-1] + ',size=' + str(numberOfNodes) + npDistFunc[l-1:]))
elif len(npDistFunc)==additionalFeatureLen:
for ndf in npDistFunc:
l = len(ndf)
features.append(eval(ndf[:l - 1] + ',size=' + str(numberOfNodes) + ndf[l - 1:]))
else:
randIntIndexnpDistFunc= np.random.randint(low=0, high=len(npDistFunc), size=additionalFeatureLen)
for index in randIntIndexnpDistFunc:
npDistFuncTemp = npDistFunc[index]
l = len(npDistFuncTemp)
features.append(eval(npDistFuncTemp[:l - 1] + ',size=' + str(numberOfNodes) + npDistFuncTemp[l - 1:]))
featuresNP=np.array(features)
if addTraidtionalFeatures and additionalFeatureLen >0:
return [ nodeType(age=featuresAgeGenderLocation[0][i], gender=featuresAgeGenderLocation[1][i], location=featuresAgeGenderLocation[2][i], label=commonLabel, DNA=DNA, Graph=Graph, additionalFeatures= featuresNP[:,i]) for i in range(numberOfNodes)]
elif addTraidtionalFeatures and additionalFeatureLen ==0:
return [nodeType(age=featuresAgeGenderLocation[0][i], gender=featuresAgeGenderLocation[1][i],
location=featuresAgeGenderLocation[2][i], label=commonLabel, DNA=DNA, Graph=Graph) for i in range(numberOfNodes)]
elif not addTraidtionalFeatures and additionalFeatureLen >0:
return [nodeType( label=commonLabel, DNA=DNA, Graph=Graph, additionalFeatures=featuresNP[:, i]) for i in range(numberOfNodes)]
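The `npDistFunc` handling above splices a size argument into the call string just before its closing parenthesis and then `eval`s it. The string manipulation in isolation looks like this (a sketch; the helper name `with_size` is illustrative):

```python
import numpy as np  # needed in scope when the spliced string is eval'd

def with_size(dist_call, n):
    # 'np.random.normal(0, 1)' -> 'np.random.normal(0, 1,size=5)'
    l = len(dist_call)
    return dist_call[:l - 1] + ',size=' + str(n) + dist_call[l - 1:]

samples = eval(with_size('np.random.normal(0, 1)', 5))
```

Note that `eval` on caller-supplied strings executes arbitrary code, so this pattern is only safe when the distribution strings come from trusted configuration.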
def createSocialNodesNFeaturesSameDist(numberOfNodes, nodeType, dna, Graph, additionalFeatureLen,
addTraidtionalFeatures, npDistFunc,labelSplit,DnaObjType):
'''
:param numberOfNodes: total number of nodes to generate (int)
:param nodeType: class of node to instantiate; must be NodeSocial or a more advanced subclass of Node, but not Node itself. Pass the class itself, not a string.
:param dna: 'auto' (string). This differs from the other node-generation methods because here the labels and DNAs are generated implicitly within the method.
:param Graph: Graph object; all generated nodes must belong to a graph to enable socialisation
:param additionalFeatureLen: number of additional features besides age, gender, and location (int)
:param addTraidtionalFeatures: whether to add the age, gender, and location features by default (True/False)
:param npDistFunc: distribution of the additional features, usually a numpy call given as a string (include the package name, i.e. numpy/np) without a size argument, since the size is injected by this function. A user-written function may be used instead, provided it accepts an integer size parameter giving the number of random numbers to draw.
:param labelSplit: the label split, passed as a vector (or a single integer) of cumulative boundaries. For example, with [10, 20, 30], 30 nodes are generated: the first 10 share DNA object 1, the next 10 DNA object 2, and the last 10 DNA object 3. DNA objects are auto-generated and assigned here, and each corresponds to a node label, so the first 10 nodes get label 0, the next 10 label 1, and the last 10 label 2. To give every node a distinct DNA and label, pass a vector such as [1, 2, 3, ..., 28, 29].
:param DnaObjType: DNA class, DNA or any more advanced DNA class. Pass the class itself, not a string.
:return: list of NodeSocial (or more advanced) objects
'''
featuresAgeGenderLocation = []
features = []
featureLen = 0
if addTraidtionalFeatures:
featuresAgeGenderLocation.append(np.random.randint(18, high=80, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(0, high=2, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(1, high=100, size=numberOfNodes))
featureLen=len(featuresAgeGenderLocation)+featureLen
if npDistFunc is not None:
if not isinstance(npDistFunc, list):
for i in range(0, additionalFeatureLen):
l = len(npDistFunc)
# splice ',size=numberOfNodes' in before the closing parenthesis of the call string
features.append(eval(npDistFunc[:l - 1] + ',size=' + str(numberOfNodes) + npDistFunc[l - 1:]))
elif len(npDistFunc) == additionalFeatureLen:
for ndf in npDistFunc:
l = len(ndf)
features.append(eval(ndf[:l - 1] + ',size=' + str(numberOfNodes) + ndf[l - 1:]))
else:
randIntIndexnpDistFunc = np.random.randint(low=0, high=len(npDistFunc), size=additionalFeatureLen)
for index in randIntIndexnpDistFunc:
npDistFuncTemp = npDistFunc[index]
l = len(npDistFuncTemp)
features.append(
eval(npDistFuncTemp[:l - 1] + ',size=' + str(numberOfNodes) + npDistFuncTemp[l - 1:]))
featuresNP = np.array(features)
featureLen=len(features)+featureLen
N = []
tempN = []
DNAlist = []
for i in range(0, len(labelSplit)):
DNAlist.append(DnaObjType(dna, len=featureLen,useGPU=Graph._useGPU,createInGPUMem=Graph.createInGPUMem))
if i ==0:
startingIndex = 0
endingIndex = labelSplit[i] - 1
else:
endingIndex = labelSplit[i] - 1
if addTraidtionalFeatures and additionalFeatureLen > 0:
tempN = [nodeType(age=featuresAgeGenderLocation[0][j], gender=featuresAgeGenderLocation[1][j],
location=featuresAgeGenderLocation[2][j], label=i, DNA=DNAlist[i], Graph=Graph,
additionalFeatures=featuresNP[:, j]) for j in range(startingIndex,endingIndex+1)]
elif addTraidtionalFeatures and additionalFeatureLen == 0:
tempN = [nodeType(age=featuresAgeGenderLocation[0][j], gender=featuresAgeGenderLocation[1][j],
location=featuresAgeGenderLocation[2][j], label=i, DNA=DNAlist[i], Graph=Graph) for
j in range(startingIndex,endingIndex+1)]
elif not addTraidtionalFeatures and additionalFeatureLen > 0:
tempN = [nodeType(label=i, DNA=DNAlist[i], Graph=Graph, additionalFeatures=featuresNP[:, j]) for j in
range(startingIndex, endingIndex + 1)]
startingIndex = labelSplit[i]
N = [*N, *tempN]
DNAlist[-1]._assignedNodes(tempN)
Graph.DNA = DNAlist
return N
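# The string-splicing used with eval above can be sketched in isolation. This is a
# minimal illustration with hypothetical values for npDistFunc and numberOfNodes;
# note that eval on untrusted strings is unsafe outside controlled input.

```python
import numpy as np

# Hypothetical values for illustration only.
npDistFunc = 'np.random.normal(0, 1)'  # distribution call without a size argument
numberOfNodes = 5

# Splice ',size=<n>' in front of the closing parenthesis, then evaluate.
l = len(npDistFunc)
expanded = npDistFunc[:l - 1] + ',size=' + str(numberOfNodes) + npDistFunc[l - 1:]
# expanded == 'np.random.normal(0, 1,size=5)'
samples = eval(expanded)  # caution: eval is unsafe on untrusted strings
```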
def createSocialNodesNFeaturesSameDistWithDNAShuffled(numberOfNodes, nodeType, dna, Graph, additionalFeatureLen,
addTraidtionalFeatures, npDistFunc,labelSplit,DnaObjType):
'''
:param numberOfNodes: The total number of nodes to be generated. Integer
:param nodeType: The node type; must be NodeSocial or a more advanced child class of Node, but not Node itself. Pass the class itself, not a string
:param dna: 'auto', string. This differs from the other node-generation methods because here labels and DNAs are generated implicitly within the method.
:param Graph: Graph type object. All generated nodes must be contained within a graph to enable socialisation.
:param additionalFeatureLen: Number of additional features other than age, gender, and location. Integer
:param addTraidtionalFeatures: Whether or not to add the age, gender, and location features by default. True/False
:param npDistFunc: Distribution of additional features. Usually a numpy method as a string. Do not define the size of the generated numbers. Include the package name, i.e. numpy/np. If it is not a numpy
method but a user-written method, make sure it accepts an integer param named size which determines how many random numbers are generated.
:param labelSplit: The label split, passed as a vector or a single integer. For example, if it is [10, 20, 30] then 30 nodes will be generated: the first 10 nodes get DNA object 1,
the 2nd 10 get DNA object 2, and the 3rd 10 get DNA object 3. In this function DNA objects are auto-generated and assigned. These DNAs also correspond to the labels of the nodes, so the first 10 nodes get
label 0, the 2nd 10 label 1, and the 3rd 10 label 2. If you want each node to have a distinct DNA and label, pass a vector such as [1,2,3,...,28,29]
:param DnaObjType: DNA class name, DNA or any other advanced DNA class. Pass the class itself, not a string
:return: list of NodeSocial or more advanced type of objects
'''
featuresAgeGenderLocation = []
features = []
featureLen = 0
if addTraidtionalFeatures:
featuresAgeGenderLocation.append(np.random.randint(18, high=80, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(0, high=2, size=numberOfNodes))
featuresAgeGenderLocation.append(np.random.randint(1, high=100, size=numberOfNodes))
featureLen=len(featuresAgeGenderLocation)+featureLen
if npDistFunc is not None:
if not isinstance(npDistFunc, list):
for i in range(0, additionalFeatureLen):
l = len(npDistFunc)
features.append(eval(npDistFunc[:l - 1] + ',size=' + str(numberOfNodes) + npDistFunc[l - 1:]))
elif len(npDistFunc) == additionalFeatureLen:
for ndf in npDistFunc:
l = len(ndf)
features.append(eval(ndf[:l - 1] + ',size=' + str(numberOfNodes) + ndf[l - 1:]))
else:
randIntIndexnpDistFunc = np.random.randint(low=0, high=len(npDistFunc), size=additionalFeatureLen)
for index in randIntIndexnpDistFunc:
npDistFuncTemp = npDistFunc[index]
l = len(npDistFuncTemp)
features.append(
eval(npDistFuncTemp[:l - 1] + ',size=' + str(numberOfNodes) + npDistFuncTemp[l - 1:]))
featuresNP = np.array(features)
featureLen=len(features)+featureLen
N = []
tempN = []
DNAlist = []
DNASpreadIndex = np.empty(labelSplit[-1],dtype=int)
startIndexTemp = 0
for i in range(0, len(labelSplit)):
DNASpreadIndex[startIndexTemp:labelSplit[i]]= i
startIndexTemp = labelSplit[i]
DNAlist.append(DnaObjType(dna, len=featureLen,useGPU=Graph._useGPU,createInGPUMem=Graph.createInGPUMem))
# a single in-place shuffle is enough to randomise the DNA/label spread
np.random.shuffle(DNASpreadIndex)
for i in range(0, len(labelSplit)):
print('generating nodes with: ' + str(featureLen) + ' features each')
if i ==0:
startingIndex = 0
endingIndex = labelSplit[i] - 1
else:
endingIndex = labelSplit[i] - 1
if addTraidtionalFeatures and additionalFeatureLen > 0:
for j in range(startingIndex, endingIndex + 1):
tempN.append(nodeType(age=featuresAgeGenderLocation[0][j], gender=featuresAgeGenderLocation[1][j],
location=featuresAgeGenderLocation[2][j], label=DNASpreadIndex[j], DNA=DNAlist[DNASpreadIndex[j]], Graph=Graph,
additionalFeatures=featuresNP[:, j]))
DNAlist[DNASpreadIndex[j]]._assignedNode(tempN[-1])
elif addTraidtionalFeatures and additionalFeatureLen == 0:
for j in range(startingIndex,endingIndex+1):
tempN.append(nodeType(age=featuresAgeGenderLocation[0][j], gender=featuresAgeGenderLocation[1][j],
location=featuresAgeGenderLocation[2][j], label=DNASpreadIndex[j], DNA=DNAlist[DNASpreadIndex[j]], Graph=Graph))
DNAlist[DNASpreadIndex[j]]._assignedNode(tempN[-1])
elif not addTraidtionalFeatures and additionalFeatureLen > 0:
for j in range(startingIndex, endingIndex + 1):
tempN.append(nodeType(label=DNASpreadIndex[j], DNA=DNAlist[DNASpreadIndex[j]], Graph=Graph, additionalFeatures=featuresNP[:, j]))
DNAlist[DNASpreadIndex[j]]._assignedNode(tempN[-1])
startingIndex = labelSplit[i]
N = [*N, *tempN]
tempN = []
Graph.DNA = DNAlist
return N
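# The labelSplit-to-DNA spread built above can be sketched on its own. A minimal
# sketch with a hypothetical labelSplit: each node position is mapped to a
# label/DNA index, then the mapping is shuffled in place.

```python
import numpy as np

labelSplit = [10, 20, 30]  # hypothetical split: 30 nodes, 3 labels
DNASpreadIndex = np.empty(labelSplit[-1], dtype=int)
startIndexTemp = 0
for i in range(len(labelSplit)):
    # fill positions startIndexTemp..labelSplit[i]-1 with label i
    DNASpreadIndex[startIndexTemp:labelSplit[i]] = i
    startIndexTemp = labelSplit[i]
# before shuffling: 10 zeros, then 10 ones, then 10 twos
np.random.shuffle(DNASpreadIndex)  # one shuffle gives a uniform random permutation
```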
# test = PetriDish.createSimpleNodes(PetriDish,numberOfNodes=5,nodeType=Node,DNA='auto',Graph=Networks.RandomSocialGraph())
# === WH_Utils/__init__.py (WealtHawk-prod/WH_Utils, Apache-2.0) ===
from WH_Utils.External import *
from WH_Utils.Objects import *
from WH_Utils.Utils.global_utils import *
"this is a test"
| 24.4 | 41 | 0.795082 | 21 | 122 | 4.428571 | 0.52381 | 0.193548 | 0.354839 | 0.365591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131148 | 122 | 4 | 42 | 30.5 | 0.877358 | 0 | 0 | 0 | 0 | 0 | 0.114754 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
acb75e3a3748fb96d65e520aa1ccd3f825dc34b2 | 106 | py | Python | 2_uzd.py | JanisRatnieksGitHub/eksamens_JR | 5d7b1beb94ba8938c3279bb7a15f506c12bf6e3c | [
"MIT"
] | null | null | null | 2_uzd.py | JanisRatnieksGitHub/eksamens_JR | 5d7b1beb94ba8938c3279bb7a15f506c12bf6e3c | [
"MIT"
] | null | null | null | 2_uzd.py | JanisRatnieksGitHub/eksamens_JR | 5d7b1beb94ba8938c3279bb7a15f506c12bf6e3c | [
"MIT"
] | null | null | null | import sys
print("Python version")
print (sys.version)
# print("Version info.")
# print (sys.version_info)

# === uri/1682.py (italo-batista/problems-solving, MIT) ===
sequence = ("NONPNOPNPONOPNONPNOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPONPNONPONOPNONPNOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPONPNONPONOPNONPNOPNPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNON"
"PNOPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPONPNONPONOPNONPNOPNPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNOPNPONOPNONPNOPNPONOPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPONPNONPONOPNONPNOPNPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNO"
"PONOPNONPONOPNPONPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPONPNONPONOPNONPNOPNPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPOPNOPONOPNONPNOPNPONOPNONPNOPONOPNONPONOPNPONPNONPONOPNONPNOPNPONOPNONPONOPNPONPNONPONOPNONPNOPONOPNONPONOPNPOPNONPNOPNPONOPNONPNOPNPOPNONPNOP")
n = int(input())
while n != 0:
try:
genome = sequence[:n]
print(genome)
n = int(input())
except Exception:
break
# === tests/extensions/aria_extension_tosca/simple_v1_0/templates/common/test_template_parameters.py (tnadeau/incubator-ariatosca, Apache-2.0) ===
# -*- coding: utf-8 -*-
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Unified testing for properties, attributes, and inputs.
Additional tests for properties are in test_template_properties.py.
"""
import pytest
from ... import data
from ......mechanisms.utils import matrix
# Assigning to parameters defined at a type
MAIN_MACROS = """
{% macro additions() %}
{%- endmacro %}
{% macro type_parameters() %}
{{ parameter_section }}: {{ caller()|indent(6) }}
{%- endmacro %}
{% macro parameters() %}
{{ parameter_section }}: {{ caller()|indent(8) }}
{%- endmacro %}
"""
# Assigning to parameters defined at a capability type
CAPABILITY_MACROS = """
{% macro additions() %}
{%- endmacro %}
{% macro type_parameters() %}
capabilities:
my_capability:
type: MyType
capability_types:
MyType:
{{ parameter_section }}: {{ caller()|indent(6) }}
{%- endmacro %}
{% macro parameters() %}
capabilities:
my_capability:
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
"""
# Assigning to parameters defined at an artifact type
ARTIFACT_MACROS = """
{% macro additions() %}
{%- endmacro %}
{% macro type_parameters() %} {}
artifact_types:
MyType:
{{ parameter_section }}: {{ caller()|indent(6) }}
{%- endmacro %}
{% macro parameters() %}
artifacts:
my_artifact:
type: MyType
file: a file
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
"""
# Assigning to inputs defined at an interface type
INTERFACE_MACROS = """
{% macro additions() %}
{%- endmacro %}
{% macro type_parameters() %}
interfaces:
MyInterface:
type: MyType
interface_types:
MyType:
{{ parameter_section }}: {{ caller()|indent(6) }}
{%- endmacro %}
{% macro parameters() %}
interfaces:
MyInterface:
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
"""
# Assigning to inputs defined at an operation of an interface type
OPERATION_MACROS = """
{% macro additions() %}
{%- endmacro %}
{% macro type_parameters() %}
interfaces:
MyInterface:
type: MyType
interface_types:
MyType:
my_operation:
{{ parameter_section }}: {{ caller()|indent(8) }}
{%- endmacro %}
{% macro parameters() %}
interfaces:
MyInterface:
my_operation:
{{ parameter_section }}: {{ caller()|indent(14) }}
{%- endmacro %}
"""
# Assigning to inputs defined (added/overridden) at an interface of the template's type
LOCAL_INTERFACE_MACROS = """
{% macro additions() %}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
interfaces:
MyInterface:
type: MyType
{{ parameter_section }}: {{ caller()|indent(10) }}
{%- endmacro %}
{% macro parameters() %}
interfaces:
MyInterface:
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
"""
# Assigning to inputs defined (added/overridden) at an operation of an interface of the template's
# type
LOCAL_OPERATION_MACROS = """
{% macro additions() %}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
interfaces:
MyInterface:
type: MyType
my_operation:
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
{% macro parameters() %}
interfaces:
MyInterface:
my_operation:
{{ parameter_section }}: {{ caller()|indent(14) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to parameters defined at a relationship type
RELATIONSHIP_TYPE_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
relationship_types:
MyType:
{{ parameter_section }}: {{ caller()|indent(6) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
{{ parameter_section }}: {{ caller()|indent(16) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined at an interface type
RELATIONSHIP_INTERFACE_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
relationship_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
interfaces:
MyInterface:
type: MyType
interface_types:
MyType:
{{ parameter_section }}: {{ caller()|indent(8) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
{{ parameter_section }}: {{ caller()|indent(20) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined at an operation of an interface
# type
RELATIONSHIP_OPERATION_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
relationship_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
interfaces:
MyInterface:
type: MyType
interface_types:
MyType:
my_operation:
{{ parameter_section }}: {{ caller()|indent(10) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
my_operation:
{{ parameter_section }}: {{ caller()|indent(22) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined (added/overridden) at an
# interface of a relationship type
RELATIONSHIP_TYPE_INTERFACE_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
relationship_types:
MyType:
interfaces:
MyInterface:
type: MyType
{{ parameter_section }}: {{ caller()|indent(10) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
{{ parameter_section }}: {{ caller()|indent(20) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined (added/overridden) at an
# operation of an interface of a relationship type
RELATIONSHIP_TYPE_OPERATION_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
relationship_types:
MyType:
interfaces:
MyInterface:
type: MyType
my_operation:
{{ parameter_section }}: {{ caller()|indent(12) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
my_operation:
{{ parameter_section }}: {{ caller()|indent(22) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined (added/overridden) at an
# interface of the relationship of the node type
RELATIONSHIP_LOCAL_INTERFACE_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
relationship_types:
MyType: {}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
interfaces:
MyInterface:
type: MyType
{{ parameter_section }}: {{ caller()|indent(18) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
{{ parameter_section }}: {{ caller()|indent(20) }}
{%- endmacro %}
"""
# At a relationship of a node template, assigning to inputs defined (added/overridden) at an
# operation of an interface of the relationship of the node type
RELATIONSHIP_LOCAL_OPERATION_MACROS = """
{% macro additions() %}
capability_types:
MyType: {}
relationship_types:
MyType: {}
interface_types:
MyType: {}
{%- endmacro %}
{% macro type_parameters() %}
requirements:
- my_requirement:
capability: MyType
relationship:
type: MyType
interfaces:
MyInterface:
type: MyType
my_operation:
{{ parameter_section }}: {{ caller()|indent(20) }}
{%- endmacro %}
{% macro parameters() %}
requirements:
- my_requirement:
relationship:
interfaces:
MyInterface:
my_operation:
{{ parameter_section }}: {{ caller()|indent(22) }}
{%- endmacro %}
"""
MACROS = {
'main': MAIN_MACROS,
'capability': CAPABILITY_MACROS,
'artifact': ARTIFACT_MACROS,
'interface': INTERFACE_MACROS,
'operation': OPERATION_MACROS,
'local-interface': LOCAL_INTERFACE_MACROS,
'local-operation': LOCAL_OPERATION_MACROS,
'relationship-type': RELATIONSHIP_TYPE_MACROS,
'relationship-interface': RELATIONSHIP_INTERFACE_MACROS,
'relationship-operation': RELATIONSHIP_OPERATION_MACROS,
'relationship-type-interface': RELATIONSHIP_TYPE_INTERFACE_MACROS,
'relationship-type-operation': RELATIONSHIP_TYPE_OPERATION_MACROS,
'relationship-local-interface': RELATIONSHIP_LOCAL_INTERFACE_MACROS,
'relationship-local-operation': RELATIONSHIP_LOCAL_OPERATION_MACROS
}
PERMUTATIONS = (
('main', 'node', 'properties'),
('main', 'node', 'attributes'),
('main', 'group', 'properties'),
('main', 'relationship', 'properties'),
('main', 'relationship', 'attributes'),
('main', 'policy', 'properties'),
('capability', 'node', 'properties'),
('capability', 'node', 'attributes'),
('artifact', 'node', 'properties'),
('interface', 'node', 'inputs'),
('interface', 'group', 'inputs'),
('interface', 'relationship', 'inputs'),
('operation', 'node', 'inputs'),
('operation', 'group', 'inputs'),
('operation', 'relationship', 'inputs'),
('local-interface', 'node', 'inputs'),
('local-interface', 'group', 'inputs'),
('local-interface', 'relationship', 'inputs'),
('local-operation', 'node', 'inputs'),
('local-operation', 'group', 'inputs'),
('local-operation', 'relationship', 'inputs'),
('relationship-type', 'node', 'properties'),
('relationship-interface', 'node', 'inputs'),
#('relationship-operation', 'node', 'inputs'), # fix
('relationship-type-interface', 'node', 'inputs'),
('relationship-type-operation', 'node', 'inputs'), # fix
('relationship-local-interface', 'node', 'inputs'),
#('relationship-local-operation', 'node', 'inputs'), # fix
)
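# Hedged sketch of the imported `matrix` helper used in the parametrize calls in
# this file (an assumption about its behaviour, not the project's implementation):
# it yields the cross product of its argument lists, flattening each item into
# counts[i] positional values, so PERMUTATIONS triples combine with single values.

```python
import itertools

def matrix_sketch(*iterables, counts):
    # cross product of the argument lists
    for combo in itertools.product(*iterables):
        row = []
        for item, count in zip(combo, counts):
            # flatten multi-element items; wrap scalars as 1-tuples
            row.extend(item if count > 1 else (item,))
        yield tuple(row)

# matrix_sketch([('a', 'b', 'c')], [1, 2], counts=(3, 1)) yields
# ('a', 'b', 'c', 1) and ('a', 'b', 'c', 2)
```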
# Parameters section
@pytest.mark.parametrize('macros,name,parameter_section,value', matrix(
PERMUTATIONS,
data.NOT_A_DICT,
counts=(3, 1)
))
def test_template_parameters_section_syntax_type(parser, macros, name, parameter_section, value):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
{{ name }}_types:
MyType:
{%- call type_parameters() -%}
{}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() -%}
{{ value }}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section, value=value)).assert_failure()
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameters_section_syntax_empty(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
{{ name }}_types:
MyType:
{%- call type_parameters() -%}
{}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() -%}
{}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section)).assert_success()
# Parameter
@pytest.mark.parametrize('macros,name,parameter_section', (('capability', 'node', 'attributes'),))
def test_template_parameter_missing(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
{{ name }}_types:
MyType:
{%- call type_parameters() -%}
{}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter: a value
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section)).assert_failure()
# Entry schema
@pytest.mark.parametrize('macros,name,parameter_section,values', matrix(
PERMUTATIONS,
data.ENTRY_SCHEMA_VALUES,
counts=(3, 1)
))
def test_template_parameter_map(parser, macros, name, parameter_section, values):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
default: default value
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: map
entry_schema: {{ values[0] }}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
key1: {{ values[1] }}
key2: {{ values[2] }}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name], parameter_section=parameter_section,
values=values), import_profile=True).assert_success()
@pytest.mark.parametrize('macros,name,parameter_section,values', matrix(
PERMUTATIONS,
data.ENTRY_SCHEMA_VALUES_BAD,
counts=(3, 1)
))
def test_template_parameter_map_bad(parser, macros, name, parameter_section, values):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
default: default value
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: map
entry_schema: {{ values[0] }}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
key1: {{ values[1] }}
key2: {{ values[2] }}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name], parameter_section=parameter_section,
values=values), import_profile=True).assert_failure()
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameter_map_required_field(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: map
entry_schema: MyType
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
key: {my_field: a value}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section), import_profile=True).assert_success()
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameter_map_required_field_bad(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: map
entry_schema: MyType
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
key: {}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section), import_profile=True).assert_failure()
@pytest.mark.parametrize('macros,name,parameter_section,values', matrix(
PERMUTATIONS,
data.ENTRY_SCHEMA_VALUES,
counts=(3, 1)
))
def test_template_parameter_list(parser, macros, name, parameter_section, values):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
default: default value
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: list
entry_schema: {{ values[0] }}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
- {{ values[1] }}
- {{ values[2] }}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name], parameter_section=parameter_section,
values=values), import_profile=True).assert_success()
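# The ``matrix`` helper used in the parametrizations above is not shown in
# this section; a plausible pure-Python sketch of its assumed behavior (the
# real test utility may differ) is:

```python
import itertools

def matrix(*sources, counts):
    # Hypothetical sketch (assumed semantics): take the cross product of the
    # parameter sources and flatten each combination so that source i
    # contributes counts[i] positional test parameters. With counts=(3, 1),
    # each 3-tuple from PERMUTATIONS unpacks into (macros, name,
    # parameter_section), while each ENTRY_SCHEMA_VALUES entry stays a
    # single ``values`` tuple.
    rows = []
    for combo in itertools.product(*sources):
        row = []
        for item, n in zip(combo, counts):
            if n > 1:
                row.extend(item)
            else:
                row.append(item)
        rows.append(tuple(row))
    return rows
```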
@pytest.mark.parametrize('macros,name,parameter_section,values', matrix(
PERMUTATIONS,
data.ENTRY_SCHEMA_VALUES_BAD,
counts=(3, 1)
))
def test_template_parameter_list_bad(parser, macros, name, parameter_section, values):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
default: default value
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: list
entry_schema: {{ values[0] }}
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
- {{ values[1] }}
- {{ values[2] }}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name], parameter_section=parameter_section,
values=values), import_profile=True).assert_failure()
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameter_list_required_field(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: list
entry_schema: MyType
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
- {my_field: a value}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section), import_profile=True).assert_success()
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameter_list_required_field_bad(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
data_types:
MyType:
properties:
my_field:
type: string
{{ name }}_types:
MyType:
{%- call type_parameters() %}
my_parameter:
type: list
entry_schema: MyType
{% endcall %}
topology_template:
{{ section }}:
my_template:
type: MyType
{%- call parameters() %}
my_parameter:
- {}
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section), import_profile=True).assert_failure()
# Unicode
@pytest.mark.parametrize('macros,name,parameter_section', PERMUTATIONS)
def test_template_parameter_unicode(parser, macros, name, parameter_section):
parser.parse_literal(MACROS[macros] + """
tosca_definitions_version: tosca_simple_yaml_1_0
{{- additions() }}
{{ name }}_types:
類型:
{%- call type_parameters() %}
參數:
type: string
{% endcall %}
topology_template:
{{ section }}:
模板:
type: 類型
{%- call parameters() %}
參數: 值
{% endcall %}
""", dict(name=name, section=data.TEMPLATE_NAME_SECTIONS[name],
parameter_section=parameter_section)).assert_success()
# ---------------------------------------------------------------------------
# File: gators/feature_generation/tests/test_is_null.py
# Repo: Aditya-Kapadiya/gators (license: Apache-2.0)
# ---------------------------------------------------------------------------
# License: Apache-2.0
import databricks.koalas as ks
import numpy as np
import pandas as pd
import pytest
from pandas.testing import assert_frame_equal
from gators.feature_generation.is_null import IsNull
ks.set_option("compute.default_index_type", "distributed-sequence")
@pytest.fixture
def data_num():
X = pd.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC"))
X.iloc[0, :] = np.nan
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "A__is_null", "B__is_null", "C__is_null"],
)
obj = IsNull(columns=list("ABC")).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_float32_num():
X = pd.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC")).astype(np.float32)
X.iloc[0, :] = np.nan
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "A__is_null", "B__is_null", "C__is_null"],
).astype(np.float32)
obj = IsNull(columns=list("ABC"), dtype=np.float32).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_names():
X = pd.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC"))
X.iloc[0, :] = np.nan
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "AIsNull", "BIsNull", "CIsNull"],
)
obj = IsNull(
columns=list("ABC"), column_names=["AIsNull", "BIsNull", "CIsNull"]
).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_obj():
X = pd.DataFrame(
{
"A": [None, "a", "b"],
"B": [None, "c", "d"],
"C": [None, "e", "f"],
"D": [0, 1, np.nan],
}
)
X_expected = pd.DataFrame(
{
"A": [None, "a", "b"],
"B": [None, "c", "d"],
"C": [None, "e", "f"],
"D": [0, 1, np.nan],
"A__is_null": [1.0, 0.0, 0.0],
"B__is_null": [1.0, 0.0, 0.0],
"C__is_null": [1.0, 0.0, 0.0],
"D__is_null": [0.0, 0.0, 1.0],
}
)
obj = IsNull(columns=list("ABCD")).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_num_ks():
X = ks.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC"))
X.iloc[0, :] = np.nan
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "A__is_null", "B__is_null", "C__is_null"],
)
obj = IsNull(columns=list("ABC")).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_float32_num_ks():
X = ks.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC"))
X.iloc[0, :] = np.nan
X = X.astype(np.float32)
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "A__is_null", "B__is_null", "C__is_null"],
).astype(np.float32)
obj = IsNull(columns=list("ABC"), dtype=np.float32).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_names_ks():
X = ks.DataFrame(np.arange(9).reshape(3, 3), columns=list("ABC"))
X.iloc[0, :] = np.nan
X_expected = pd.DataFrame(
[
[np.nan, np.nan, np.nan, 1.0, 1.0, 1.0],
[3.0, 4.0, 5.0, 0.0, 0.0, 0.0],
[6.0, 7.0, 8.0, 0.0, 0.0, 0.0],
],
columns=["A", "B", "C", "AIsNull", "BIsNull", "CIsNull"],
)
obj = IsNull(
columns=list("ABC"), column_names=["AIsNull", "BIsNull", "CIsNull"]
).fit(X)
return obj, X, X_expected
@pytest.fixture
def data_obj_ks():
X = ks.DataFrame(
{
"A": [None, "a", "b"],
"B": [None, "c", "d"],
"C": [None, "e", "f"],
"D": [0, 1, np.nan],
}
)
X_expected = pd.DataFrame(
{
"A": [None, "a", "b"],
"B": [None, "c", "d"],
"C": [None, "e", "f"],
"D": [0, 1, np.nan],
"A__is_null": [1.0, 0.0, 0.0],
"B__is_null": [1.0, 0.0, 0.0],
"C__is_null": [1.0, 0.0, 0.0],
"D__is_null": [0.0, 0.0, 1.0],
}
)
obj = IsNull(columns=list("ABCD")).fit(X)
return obj, X, X_expected
def test_pd(data_num):
obj, X, X_expected = data_num
X_new = obj.transform(X)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_ks(data_num_ks):
obj, X, X_expected = data_num_ks
X_new = obj.transform(X)
assert_frame_equal(X_new.to_pandas(), X_expected)
def test_pd_np(data_num):
obj, X, X_expected = data_num
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_ks_np(data_num_ks):
obj, X, X_expected = data_num_ks
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
def test_float32_pd(data_float32_num):
obj, X, X_expected = data_float32_num
X_new = obj.transform(X)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_float32_ks(data_float32_num_ks):
obj, X, X_expected = data_float32_num_ks
X_new = obj.transform(X)
assert_frame_equal(X_new.to_pandas(), X_expected)
def test_float32_pd_np(data_float32_num):
obj, X, X_expected = data_float32_num
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_float32_ks_np(data_float32_num_ks):
obj, X, X_expected = data_float32_num_ks
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
def test_names_pd(data_names):
obj, X, X_expected = data_names
X_new = obj.transform(X)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_names_ks(data_names_ks):
obj, X, X_expected = data_names_ks
X_new = obj.transform(X)
assert_frame_equal(X_new.to_pandas(), X_expected)
def test_names_pd_np(data_names):
obj, X, X_expected = data_names
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
@pytest.mark.koalas
def test_names_ks_np(data_names_ks):
obj, X, X_expected = data_names_ks
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(X_new, X_expected)
def test_obj(data_obj):
obj, X, X_expected = data_obj
X_new = obj.transform(X)
assert_frame_equal(
X_new.iloc[:, 4:].astype(float), X_expected.iloc[:, 4:].astype(float)
)
@pytest.mark.koalas
def test_obj_ks(data_obj_ks):
obj, X, X_expected = data_obj_ks
X_new = obj.transform(X).to_pandas()
assert_frame_equal(
X_new.iloc[:, 4:].astype(float), X_expected.iloc[:, 4:].astype(float)
)
def test_obj_np(data_obj):
obj, X, X_expected = data_obj
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(
X_new.iloc[:, 4:].astype(float), X_expected.iloc[:, 4:].astype(float)
)
@pytest.mark.koalas
def test_obj_ks_np(data_obj_ks):
obj, X, X_expected = data_obj_ks
X_numpy_new = obj.transform_numpy(X.to_numpy())
X_new = pd.DataFrame(X_numpy_new)
X_expected = pd.DataFrame(X_expected.values)
assert_frame_equal(
X_new.iloc[:, 4:].astype(float), X_expected.iloc[:, 4:].astype(float)
)
def test_init():
with pytest.raises(TypeError):
_ = IsNull(columns=0)
with pytest.raises(ValueError):
_ = IsNull(columns=[], column_names=["AIsNull"])
with pytest.raises(TypeError):
_ = IsNull(columns=list("ABC"), column_names=0)
with pytest.raises(ValueError):
_ = IsNull(columns=list("ABC"), column_names=["a", "b"])
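# For reference, the behavior these tests pin down can be sketched in a few
# lines of pandas. This is an illustrative stand-in, not the gators
# implementation; the ``<column>__is_null`` naming follows the expected
# columns asserted in the fixtures above.

```python
import numpy as np
import pandas as pd

def is_null_sketch(X, columns, dtype=np.float64, column_names=None):
    # Append one indicator column per input column: 1.0 where the value is
    # missing (NaN/None), 0.0 otherwise.
    if column_names is None:
        column_names = [c + "__is_null" for c in columns]
    X_new = X.copy()
    for col, name in zip(columns, column_names):
        X_new[name] = X[col].isna().astype(dtype)
    return X_new
```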
# ---------------------------------------------------------------------------
# File: calc/Seconds.py
# Repo: beyondjjw/SendStocks (license: MIT)
# ---------------------------------------------------------------------------
import sys
def stop(s=1):
    return int(s)

# ---------------------------------------------------------------------------
# File: tests/integration/docusaurus/connecting_to_your_data/how_to_introspect_and_partition_your_data/sql_database/yaml_example_gradual.py
# Repo: arunnthevapalan/great_expectations (license: Apache-2.0)
# ---------------------------------------------------------------------------
from ruamel import yaml
# <snippet>
import great_expectations as ge
# </snippet>
# <snippet>
context = ge.get_context()
# </snippet>
# <snippet>
datasource_yaml = f"""
name: taxi_datasource
class_name: SimpleSqlalchemyDatasource
connection_string: <CONNECTION_STRING>
introspection: # Each key in the "introspection" section is the name of an InferredAssetSqlDataConnector (key name "introspection" in "SimpleSqlalchemyDatasource" configuration is reserved).
whole_table: {{}} # Any alphanumeric key name is acceptable.
"""
# </snippet>
# Please note this override is only to provide good UX for docs and tests.
# In normal usage you'd set your path directly in the yaml above.
data_dir_path = "data"
CONNECTION_STRING = f"sqlite:///{data_dir_path}/yellow_tripdata.db"
datasource_yaml = datasource_yaml.replace("<CONNECTION_STRING>", CONNECTION_STRING)
# <snippet>
context.test_yaml_config(datasource_yaml)
# </snippet>
# <snippet>
datasource_yaml = f""" # buggy datasource_yaml configuration
name: mis_configured_datasource
class_name: SimpleSqlalchemyDatasource
connection_string: <CONNECTION_STRING>
introspecting: # illegal top-level key name
whole_table: {{}}
"""
# </snippet>
datasource_yaml = datasource_yaml.replace("<CONNECTION_STRING>", CONNECTION_STRING)
# <snippet>
context.test_yaml_config(datasource_yaml)
# </snippet>
# <snippet>
datasource_yaml = f"""
name: taxi_datasource
class_name: SimpleSqlalchemyDatasource
connection_string: <CONNECTION_STRING>
introspection: # Each key in the "introspection" section is the name of an InferredAssetSqlDataConnector (key name "introspection" in "SimpleSqlalchemyDatasource" configuration is reserved).
whole_table:
include_schema_name: true
introspection_directives:
include_views: true
skip_inapplicable_tables: true # skip and continue upon encountering introspection errors
excluded_tables: # a list of tables to ignore when inferring data asset_names
- main.yellow_tripdata_sample_2019_03 # format: schema_name.table_name
"""
# </snippet>
datasource_yaml = datasource_yaml.replace("<CONNECTION_STRING>", CONNECTION_STRING)
context.test_yaml_config(datasource_yaml)
# <snippet>
context.add_datasource(**yaml.load(datasource_yaml))
# </snippet>
# <snippet>
available_data_asset_names = context.datasources[
"taxi_datasource"
].get_available_data_asset_names(data_connector_names="whole_table")["whole_table"]
# </snippet>
assert len(available_data_asset_names) == 2
# <snippet>
datasource_yaml = f"""
name: taxi_datasource
class_name: SimpleSqlalchemyDatasource
connection_string: <CONNECTION_STRING>
tables: # Each key in the "tables" section is a table_name (key name "tables" in "SimpleSqlalchemyDatasource" configuration is reserved).
yellow_tripdata_sample_2019_01: # Must match table name exactly.
partitioners: # Each key in the "partitioners" sub-section the name of a ConfiguredAssetSqlDataConnector (key name "partitioners" in "SimpleSqlalchemyDatasource" configuration is reserved).
whole_table: {{}}
"""
# </snippet>
datasource_yaml = datasource_yaml.replace("<CONNECTION_STRING>", CONNECTION_STRING)
# <snippet>
context.test_yaml_config(datasource_yaml)
# </snippet>
# <snippet>
datasource_yaml = f"""
name: taxi_datasource
class_name: SimpleSqlalchemyDatasource
connection_string: <CONNECTION_STRING>
tables: # Each key in the "tables" section is a table_name (key name "tables" in "SimpleSqlalchemyDatasource" configuration is reserved).
yellow_tripdata_sample_2019_01: # Must match table name exactly.
partitioners: # Each key in the "partitioners" sub-section the name of a ConfiguredAssetSqlDataConnector (key name "partitioners" in "SimpleSqlalchemyDatasource" configuration is reserved).
whole_table:
include_schema_name: true
schema_name: main
data_asset_name_prefix: taxi__
data_asset_name_suffix: __asset
"""
# </snippet>
datasource_yaml = datasource_yaml.replace("<CONNECTION_STRING>", CONNECTION_STRING)
# <snippet>
context.test_yaml_config(datasource_yaml)
# </snippet>
# <snippet>
context.add_datasource(**yaml.load(datasource_yaml))
# </snippet>
available_data_asset_names = context.datasources[
"taxi_datasource"
].get_available_data_asset_names(data_connector_names="whole_table")["whole_table"]
assert len(available_data_asset_names) == 1
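# The prefix/suffix/schema options above compose the final data asset name;
# a hedged sketch of the assumed composition (the exact Great Expectations
# naming logic may differ) is:

```python
def compose_asset_name(table, schema=None, include_schema_name=False,
                       prefix="", suffix=""):
    # Assumed naming scheme: optional schema qualifier, then the
    # user-supplied prefix/suffix wrapped around the (possibly qualified)
    # table name.
    name = f"{schema}.{table}" if (include_schema_name and schema) else table
    return f"{prefix}{name}{suffix}"
```

# Under these assumptions the configuration above would yield
# "taxi__main.yellow_tripdata_sample_2019_01__asset".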
# ---------------------------------------------------------------------------
# File: pycode/memilio-epidata/memilio/epidata_test/test_epidata_cleandata.py
# Repo: DLR-SC/memilio (license: Apache-2.0)
# ---------------------------------------------------------------------------
#############################################################################
# Copyright (C) 2020-2021 German Aerospace Center (DLR-SC)
#
# Authors:
#
# Contact: Martin J. Kuehn <Martin.Kuehn@DLR.de>
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#############################################################################
import unittest
from pyfakefs import fake_filesystem_unittest
import os
import sys
from unittest.mock import patch
from memilio.epidata import cleanData as cd
from memilio.epidata import defaultDict as dd
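# The tests below build a directory tree of JSON/HDF5 files and verify that
# cleanData.clean_data prunes the right subset and removes directories left
# empty. The pattern can be sketched as follows (illustrative only;
# ``endings``/``keywords`` are hypothetical parameters, not the memilio API):

```python
import os

def prune_files(path, endings, keywords):
    # Walk bottom-up, delete files whose stem contains one of ``keywords``
    # and whose extension is in ``endings``, then drop directories that end
    # up empty -- mirroring how deleting a_rki.json/FullRKI.json can make
    # the Germany folder disappear in the tests below.
    for root, _dirs, files in os.walk(path, topdown=False):
        for fname in files:
            stem, ext = os.path.splitext(fname)
            if ext in endings and any(k in stem for k in keywords):
                os.remove(os.path.join(root, fname))
        if root != path and not os.listdir(root):
            os.rmdir(root)
```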
class Test_cleanData(fake_filesystem_unittest.TestCase):
path = '/home/x'
def setUp(self):
self.setUpPyfakefs()
def set_dirs_and_files(self, what):
dir_dic_all = { 'Germany' : ["a_rki", "a_jh", "FullRKI", "PopulData", "FullDataB", "FullDataL"],
'Spain': ["b_jh"],
'France': ["c_jh"],
'Italy': ["d_jh"],
'US' : ["e_jh"],
'SouthKorea' : ["f_jh"],
'China' : ["g_jh"]}
dir_dic_rki = {'Germany': ["a_rki", "FullRKI"]}
dir_dic_popul = {'Germany': ["PopulData", "FullDataB", "FullDataL"]}
dir_dic_jh = {'Germany': ["a_jh"],
'Spain': ["b_jh"],
'France': ["c_jh"],
'Italy': ["d_jh"],
'US': ["e_jh"],
'SouthKorea': ["f_jh"],
'China': ["g_jh"]}
ending_all = [".json", ".h5"]
ending_json = [".json"]
dir_choose = {"all": dir_dic_all,
"rki": dir_dic_rki,
"jh": dir_dic_jh,
"popul": dir_dic_popul
}
ending_choose= {"all": ending_all,
"rki": ending_json,
"jh": ending_json,
"popul": ending_json
}
dir_dic = dir_choose[what]
ending = ending_choose[what]
file_list = ["all_jh", "FullJohnHopkins"]
# make folders
for key in dir_dic:
dir_path = os.path.join(self.path, key)
os.makedirs(dir_path)
# make files
for file in dir_dic[key]:
for e in ending:
with open(os.path.join(dir_path, file + e), 'w') as f:
f.write('foo')
if what == "all" or what == "jh":
for file in file_list:
for e in ending:
with open(os.path.join(self.path, file + e), 'w') as f:
f.write('foo')
def test_set_dirs_and_files(self):
        # test that the setup helper works as expected
self.set_dirs_and_files("all")
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 12)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_rki.h5", "a_jh.json", "a_jh.h5", "FullRKI.json", "FullRKI.h5",
"PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
"FullDataL.json", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
# generate folder and files
def test_clean_data_all_should_delete_all(self):
self.set_dirs_and_files("all")
cd.clean_data(True, False, False, False, False, self.path)
# Should delete everything
self.assertEqual(len(os.listdir(self.path)), 0)
def test_clean_data_all_should_not_delete_all(self):
self.set_dirs_and_files("all")
# add different files and folder
os.makedirs(os.path.join(self.path, "ImportantDir"))
with open(os.path.join(self.path, "wichtig.py"), 'w') as f:
f.write('foo')
dir_path = os.path.join(self.path, "China")
with open(os.path.join(dir_path, "secret.txt"), 'w') as f:
f.write('foo')
cd.clean_data(True, False, False, False, False, self.path)
# Should delete everything
self.assertEqual(len(os.listdir(self.path)), 3)
self.assertEqual(os.listdir(self.path), ["China", "ImportantDir", "wichtig.py"])
self.assertEqual(len(os.listdir(dir_path)), 1)
self.assertEqual(os.listdir(dir_path), ["secret.txt"])
def test_clean_data_rki(self):
self.set_dirs_and_files("all")
cd.clean_data(False, True, False, False, False, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 10)
self.assertEqual(os.listdir(dir_path),
["a_rki.h5", "a_jh.json", "a_jh.h5", "FullRKI.h5",
"PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
"FullDataL.json", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_clean_data_rki_h5(self):
self.set_dirs_and_files("all")
cd.clean_data(False, True, False, False, True, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 10)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_jh.json", "a_jh.h5", "FullRKI.json",
"PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
"FullDataL.json", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_clean_data_rki_del_dir(self):
self.set_dirs_and_files("all")
dir_path = os.path.join(self.path, "Germany")
files = os.listdir(dir_path)
        # delete all files except those that clean_data is expected to delete
for item in files:
if item == "a_rki.json" or item == "FullRKI.json":
continue
else:
os.remove(os.path.join(dir_path, item))
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["a_rki.json", "FullRKI.json"])
cd.clean_data(False, True, False, False, False, self.path)
dir_list = ['Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
        # Germany should be deleted, because no files were left after deletion
self.assertEqual(len(os.listdir(self.path)), 10)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_clean_data_population(self):
        # generate folders and files
self.set_dirs_and_files("all")
cd.clean_data(False, False, False, True, False, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 9)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_rki.h5", "a_jh.json", "a_jh.h5", "FullRKI.json", "FullRKI.h5",
"PopulData.h5", "FullDataB.h5", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_clean_data_population_hdf5(self):
        # generate folders and files
self.set_dirs_and_files("all")
cd.clean_data(False, False, False, True, True, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 9)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_rki.h5", "a_jh.json", "a_jh.h5", "FullRKI.json", "FullRKI.h5",
"PopulData.json", "FullDataB.json", "FullDataL.json"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_clean_data_population_del_dir(self):
        # generate folders and files
self.set_dirs_and_files("all")
dir_path = os.path.join(self.path, "Germany")
files = os.listdir(dir_path)
        # delete all files except those that clean_data is expected to delete
for item in files:
if item == "PopulData.json" or item == "FullDataB.json" or item == "FullDataL.json":
continue
else:
os.remove(os.path.join(dir_path, item))
self.assertEqual(len(os.listdir(dir_path)), 3)
cd.clean_data(False, False, False, True, False, self.path)
dir_list = ['Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 10)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_all_false(self):
        # generate folders and files, then verify that clean_data with all
        # flags False does not delete anything
        self.set_dirs_and_files("all")
        cd.clean_data(False, False, False, False, False, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 11)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.json', 'all_jh.h5', 'FullJohnHopkins.json', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 12)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_rki.h5", "a_jh.json", "a_jh.h5", "FullRKI.json", "FullRKI.h5",
"PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
"FullDataL.json", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 2)
self.assertEqual(os.listdir(dir_path), ["b_jh.json", "b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 2)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.json", "c_jh.h5"])
def test_wrong_path(self):
cd.clean_data(True, False, False, False, False, "/home/y")
        # TODO: add an assertion; currently clean_data silently does nothing
        # for a nonexistent path
def test_clean_data_jh(self):
        # generate folders and files
self.set_dirs_and_files("all")
cd.clean_data(False, False, True, False, False, self.path)
dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
# Test wanted folder and file structure
self.assertEqual(len(os.listdir(self.path)), 9)
self.assertEqual(os.listdir(self.path),
dir_list + ['all_jh.h5', 'FullJohnHopkins.h5'])
for dir in dir_list:
dir_path = os.path.join(self.path, dir)
if dir == "Germany":
self.assertEqual(len(os.listdir(dir_path)), 11)
self.assertEqual(os.listdir(dir_path),
["a_rki.json", "a_rki.h5", "a_jh.h5", "FullRKI.json", "FullRKI.h5",
"PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
"FullDataL.json", "FullDataL.h5"])
elif dir == "Spain":
self.assertEqual(len(os.listdir(dir_path)), 1)
self.assertEqual(os.listdir(dir_path), ["b_jh.h5"])
else:
self.assertEqual(len(os.listdir(dir_path)), 1)
if dir == "France":
self.assertEqual(os.listdir(dir_path), ["c_jh.h5"])

    def test_clean_data_jh_hdf5(self):
        # test if written fct works as expected
        self.set_dirs_and_files("all")
        cd.clean_data(False, False, True, False, True, self.path)
        dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
        # Test wanted folder and file structure
        self.assertEqual(len(os.listdir(self.path)), 9)
        self.assertEqual(os.listdir(self.path),
                         dir_list + ['all_jh.json', 'FullJohnHopkins.json'])
        for dir in dir_list:
            dir_path = os.path.join(self.path, dir)
            if dir == "Germany":
                self.assertEqual(len(os.listdir(dir_path)), 11)
                self.assertEqual(os.listdir(dir_path),
                                 ["a_rki.json", "a_rki.h5", "a_jh.json", "FullRKI.json", "FullRKI.h5",
                                  "PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
                                  "FullDataL.json", "FullDataL.h5"])
            elif dir == "Spain":
                self.assertEqual(len(os.listdir(dir_path)), 1)
                self.assertEqual(os.listdir(dir_path), ["b_jh.json"])
            else:
                self.assertEqual(len(os.listdir(dir_path)), 1)
                if dir == "France":
                    self.assertEqual(os.listdir(dir_path), ["c_jh.json"])

    def test_clean_data_jh_both_endings(self):
        # test if written fct works as expected
        self.set_dirs_and_files("all")
        cd.clean_data(False, False, True, False, False, self.path)
        cd.clean_data(False, False, True, False, True, self.path)
        dir_list = ['Germany']
        # Test wanted folder and file structure
        self.assertEqual(len(os.listdir(self.path)), 1)
        self.assertEqual(os.listdir(self.path), dir_list)
        for dir in dir_list:
            dir_path = os.path.join(self.path, dir)
            if dir == "Germany":
                self.assertEqual(len(os.listdir(dir_path)), 10)
                self.assertEqual(os.listdir(dir_path),
                                 ["a_rki.json", "a_rki.h5", "FullRKI.json", "FullRKI.h5",
                                  "PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
                                  "FullDataL.json", "FullDataL.h5"])

    def test_clean_data_rki_johns_hopkins(self):
        # test if written fct works as expected
        self.set_dirs_and_files("all")
        cd.clean_data(False, True, True, False, False, self.path)
        dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
        # Test wanted folder and file structure
        self.assertEqual(len(os.listdir(self.path)), 9)
        self.assertEqual(os.listdir(self.path),
                         dir_list + ['all_jh.h5', 'FullJohnHopkins.h5'])
        for dir in dir_list:
            dir_path = os.path.join(self.path, dir)
            if dir == "Germany":
                self.assertEqual(len(os.listdir(dir_path)), 9)
                self.assertEqual(os.listdir(dir_path),
                                 ["a_rki.h5", "a_jh.h5", "FullRKI.h5",
                                  "PopulData.json", "PopulData.h5", "FullDataB.json", "FullDataB.h5",
                                  "FullDataL.json", "FullDataL.h5"])
            elif dir == "Spain":
                self.assertEqual(len(os.listdir(dir_path)), 1)
                self.assertEqual(os.listdir(dir_path), ["b_jh.h5"])
            else:
                self.assertEqual(len(os.listdir(dir_path)), 1)
                if dir == "France":
                    self.assertEqual(os.listdir(dir_path), ["c_jh.h5"])

    def test_clean_data_rki_john_hopkins_spain_population(self):
        # test if written fct works as expected
        self.set_dirs_and_files("all")
        cd.clean_data(False, True, True, True, False, self.path)
        dir_list = ['Germany', 'Spain', 'France', 'Italy', 'US', 'SouthKorea', 'China']
        # Test wanted folder and file structure
        self.assertEqual(len(os.listdir(self.path)), 9)
        self.assertEqual(os.listdir(self.path),
                         dir_list + ['all_jh.h5', 'FullJohnHopkins.h5'])
        for dir in dir_list:
            dir_path = os.path.join(self.path, dir)
            if dir == "Germany":
                self.assertEqual(len(os.listdir(dir_path)), 6)
                self.assertEqual(os.listdir(dir_path),
                                 ["a_rki.h5", "a_jh.h5", "FullRKI.h5",
                                  "PopulData.h5", "FullDataB.h5", "FullDataL.h5"])
            elif dir == "Spain":
                self.assertEqual(len(os.listdir(dir_path)), 1)
                self.assertEqual(os.listdir(dir_path), ["b_jh.h5"])
            else:
                self.assertEqual(len(os.listdir(dir_path)), 1)
                if dir == "France":
                    self.assertEqual(os.listdir(dir_path), ["c_jh.h5"])

    def test_file_not_found_rki(self):
        self.set_dirs_and_files("rki")
        # add other files and a folder
        os.makedirs(os.path.join(self.path, "ImportantDir"))
        with open(os.path.join(self.path, "wichtig.py"), 'w') as f:
            f.write('foo')
        cd.clean_data(False, True, False, False, False, self.path)
        self.assertEqual(len(os.listdir(self.path)), 2)
        self.assertEqual(os.listdir(self.path), ["ImportantDir", "wichtig.py"])

    def test_file_not_found_population(self):
        self.set_dirs_and_files("popul")
        # add other files and a folder
        os.makedirs(os.path.join(self.path, "ImportantDir"))
        with open(os.path.join(self.path, "wichtig.py"), 'w') as f:
            f.write('foo')
        cd.clean_data(False, False, False, True, False, self.path)
        self.assertEqual(len(os.listdir(self.path)), 2)
        self.assertEqual(os.listdir(self.path), ["ImportantDir", "wichtig.py"])

    def test_file_not_found_jh(self):
        self.set_dirs_and_files("jh")
        # add other files and a folder
        os.makedirs(os.path.join(self.path, "ImportantDir"))
        with open(os.path.join(self.path, "wichtig.py"), 'w') as f:
            f.write('foo')
        cd.clean_data(False, False, True, False, False, self.path)
        self.assertEqual(len(os.listdir(self.path)), 2)
        self.assertEqual(os.listdir(self.path), ["ImportantDir", "wichtig.py"])

    def test_no_files(self):
        # The following should run without any problem.
        # Every error should be caught and passed.
        # no data in folder
        cd.clean_data(False, False, True, False, False, self.path)
        # population
        cd.clean_data(False, False, False, True, False, self.path)
        # rki
        cd.clean_data(False, True, False, False, False, self.path)

    def test_cli_default(self):
        out_path_default = dd.defaultDict['out_folder']
        test_args = ["prog"]
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            print([all_data, rki, jh, popul, hdf5, out_path])
            self.assertEqual(all_data, False)
            self.assertEqual(rki, False)
            self.assertEqual(jh, False)
            self.assertEqual(popul, False)
            self.assertEqual(hdf5, False)
            self.assertEqual(out_path, out_path_default)

    def test_cli_folder(self):
        folder = "some_folder"
        test_args = ["prog", '--out_path', folder]
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            self.assertEqual(all_data, False)
            self.assertEqual(rki, False)
            self.assertEqual(jh, False)
            self.assertEqual(popul, False)
            self.assertEqual(hdf5, False)
            self.assertEqual(out_path, folder)

    def test_cli_all(self):
        out_path_default = dd.defaultDict['out_folder']
        test_args = ["prog", '--all']
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            self.assertEqual(all_data, True)
            self.assertEqual(rki, False)
            self.assertEqual(jh, False)
            self.assertEqual(popul, False)
            self.assertEqual(hdf5, False)
            self.assertEqual(out_path, out_path_default)

    def test_cli_rki(self):
        out_path_default = dd.defaultDict['out_folder']
        test_args = ["prog", '--rki']
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            self.assertEqual(all_data, False)
            self.assertEqual(rki, True)
            self.assertEqual(jh, False)
            self.assertEqual(popul, False)
            self.assertEqual(hdf5, False)
            self.assertEqual(out_path, out_path_default)

    def test_cli_jh(self):
        out_path_default = dd.defaultDict['out_folder']
        test_args = ["prog", '-j', '--hdf5']
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            self.assertEqual(all_data, False)
            self.assertEqual(rki, False)
            self.assertEqual(jh, True)
            self.assertEqual(popul, False)
            self.assertEqual(hdf5, True)
            self.assertEqual(out_path, out_path_default)

    def test_cli_popul(self):
        out_path_default = dd.defaultDict['out_folder']
        test_args = ['prog', '--population', '-h5']
        with patch.object(sys, 'argv', test_args):
            [all_data, rki, jh, popul, hdf5, out_path] = cd.cli()
            self.assertEqual(all_data, False)
            self.assertEqual(rki, False)
            self.assertEqual(jh, False)
            self.assertEqual(popul, True)
            self.assertEqual(hdf5, True)
            self.assertEqual(out_path, out_path_default)


if __name__ == '__main__':
    unittest.main()

# File: backend/tests/baserow/contrib/database/formula/test_rename_field_references.py
# Source repo: ashishdhngr/baserow (MIT license)
] | null | null | null | import pytest
from baserow.contrib.database.formula import FormulaHandler, BaserowFormulaSyntaxError
def test_replace_single_quoted_field_ref():
new_formula = FormulaHandler.rename_field_references_in_formula_string(
"field('test')",
{"test": "new " "test"},
)
assert new_formula == "field('new test')"
def test_replace_double_quoted_field_ref():
new_formula = FormulaHandler.rename_field_references_in_formula_string(
'field("test")', {"test": "new test"}
)
assert new_formula == 'field("new test")'
def test_replace_field_reference_keeping_whitespace():
new_formula = FormulaHandler.rename_field_references_in_formula_string(
" \n\tfield('test') \n\t", {"test": "new test"}
)
assert new_formula == " \n\tfield('new test') \n\t"
def test_replace_field_reference_keeping_whitespace_and_comments():
new_formula = FormulaHandler.rename_field_references_in_formula_string(
"//my line comment \n\tfield('test') /*my block comment*/\n\t",
{"test": "new " "test"},
)
assert (
new_formula == "//my line comment \n\tfield('new test') /*my block "
"comment*/\n\t"
)


def test_replace_field_reference_preserving_case():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "//my line comment \n\tADD(fIeLd('test'),1) /*my block comment*/\n\t",
        {"test": "new test"},
    )
    assert (
        new_formula == "//my line comment \n\tADD(fIeLd('new test'),1) /*my block "
        "comment*/\n\t"
    )


def test_replace_binary_op_keeping_whitespace_and_comments():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "//my line comment \n\t1+1 /*my block comment*/\n\t",
        {"test": "new test"},
    )
    assert new_formula == "//my line comment \n\t1+1 /*my block comment*/\n\t"


def test_replace_function_call_keeping_whitespace_and_comments():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "//my line comment \n\tadd( 1\t \t+\t \t1,\nfield('test')\t) /*my block "
        "comment*/\n\t",
        {"test": "new test"},
    )
    assert (
        new_formula == "//my line comment \n\tadd( 1\t \t+\t \t1,\nfield('new "
        "test')\t) /*my block comment*/\n\t"
    )


def test_replace_double_quote_field_ref_containing_single_quotes():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        'field("test with \'")', {"test with '": "new test with ' \\' and \" and \\\""}
    )
    assert new_formula == 'field("new test with \' \\\' and \\" and \\\\"")'


def test_replace_single_quote_field_ref_containing_single_quotes():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('test with \\'')", {"test with '": "new test with ' \\' and \" and \\\""}
    )
    assert new_formula == "field('new test with \\' \\\\' and \" and \\\"')"


def test_can_replace_multiple_different_field_references():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        'concat(field("test"), field("test"), field(\'other\'))',
        {"test": "new test", "other": "new other"},
    )
    assert (
        new_formula == 'concat(field("new test"), field("new test"), '
        "field('new other'))"
    )


def test_leaves_unknown_field_references_alone():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('test')",
        {},
    )
    assert new_formula == "field('test')"


def test_raises_with_field_names_for_invalid_syntax():
    _assert_raises("field('test'")
    _assert_raises("field(''''test'")
    _assert_raises("field(test")
    _assert_raises("field(1)")
    _assert_raises("field)")


def _assert_raises(formula):
    with pytest.raises(BaserowFormulaSyntaxError):
        FormulaHandler.rename_field_references_in_formula_string(
            formula,
            {
                "test": "new test",
            },
        )


def test_replaces_unknown_field_by_id_with_field():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)",
        {},
    )
    assert new_formula == "field('unknown field 1')"


def test_replaces_unknown_field_by_id_with_field_multiple():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)+concat(field('a'), field_by_id(2))",
        {},
    )
    assert (
        new_formula == "field('unknown field 1')+concat(field('a'), field('unknown "
        "field 2'))"
    )


def test_replaces_known_field_by_id():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)+concat(field('a'), field_by_id(2))",
        {},
        field_ids_to_replace_with_name_refs={1: "test", 2: "other_test"},
    )
    assert new_formula == "field('test')+concat(field('a'), field('other_test'))"


def test_replaces_functions_preserving_case():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)+CONCAT(field('a'), field_by_id(2))",
        {},
        field_ids_to_replace_with_name_refs={1: "test", 2: "other_test"},
    )
    assert new_formula == "field('test')+CONCAT(field('a'), field('other_test'))"


def test_replaces_known_field_by_id_single_quotes():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)",
        {},
        field_ids_to_replace_with_name_refs={1: "test with ' '", 2: "other_test"},
    )
    assert new_formula == "field('test with \\' \\'')"


def test_replaces_known_field_by_id_double_quotes():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field_by_id(1)",
        {},
        field_ids_to_replace_with_name_refs={1: 'test with " "', 2: "other_test"},
    )
    assert new_formula == "field('test with \" \"')"


def test_replaces_field_with_field_by_id():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('a')",
        {},
        field_names_to_replace_with_id_refs={"a": 1},
    )
    assert new_formula == "field_by_id(1)"


def test_doesnt_replace_unknown_field():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('b')+concat(field('a'), field('c'))",
        {},
        field_names_to_replace_with_id_refs={"a": 1},
    )
    assert new_formula == "field('b')+concat(field_by_id(1), field('c'))"


def test_replaces_field_with_single_quotes_with_id():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('test with \\' \\'')",
        {},
        field_names_to_replace_with_id_refs={"test with ' '": 1, "other_test": 2},
    )
    assert new_formula == "field_by_id(1)"


def test_replaces_field_with_double_quotes_with_id():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "field('test with \" \"')",
        {},
        field_names_to_replace_with_id_refs={'test with " "': 1, "other_test": 2},
    )
    assert new_formula == "field_by_id(1)"


def test_replaces_lookup():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "lookup('a', 'b')", {"a": "c"}
    )
    assert new_formula == "lookup('c', 'b')"


def test_replaces_lookup_when_via_changes():
    new_formula = FormulaHandler.rename_field_references_in_formula_string(
        "lookup('a', 'b')+field('b')", {"b": "c"}, via_field="a"
    )
    assert new_formula == "lookup('a', 'c')+field('b')"

# File: dataprep/clean/clean_ml.py
# Source repo: devinllu/dataprep (MIT license)
] | 170 | 2020-01-08T03:27:26.000Z | 2022-03-20T20:42:55.000Z | """
Implement clean_ml function
"""
# pylint: disable=too-many-arguments, too-many-locals, too-many-branches
from typing import Union, Dict, List, Tuple, Optional, Any
import dask.dataframe as dd
import pandas as pd
from .pipeline import Pipeline
from .utils import to_dask, NULL_VALUES


def clean_ml(
    training_df: Union[pd.DataFrame, dd.DataFrame],
    test_df: Union[pd.DataFrame, dd.DataFrame],
    target: str = "target",
    cat_imputation: str = "constant",
    cat_null_value: Optional[List[Any]] = None,
    fill_val: str = "missing_value",
    num_imputation: str = "mean",
    num_null_value: Optional[List[Any]] = None,
    cat_encoding: str = "one_hot",
    variance_threshold: bool = False,
    variance: float = 0.0,
    num_scaling: str = "standardize",
    include_operators: Optional[List[str]] = None,
    exclude_operators: Optional[List[str]] = None,
    customized_cat_pipeline: Optional[List[Dict[str, Any]]] = None,
    customized_num_pipeline: Optional[List[Dict[str, Any]]] = None,
) -> Tuple[pd.DataFrame, pd.DataFrame]:
"""
This function transforms an arbitrary tabular dataset
into a format that's suitable for a typical ML application.
Parameters
----------
training_df
Training dataframe. Pandas or Dask DataFrame.
test_df
Test dataframe. Pandas or Dask DataFrame.
target
Name of target column. String.
cat_imputation
The mode of imputation for categorical columns.
If it equals to "constant",
then all missing values are filled with `fill_val`.
If it equals to "most_frequent",
then all missing values are filled with most frequent value.
If it equals to "drop",
then all categorical columns with missing values will be dropped.
cat_null_value
Specified categorical null values which should be recognized.
fill_val
When cat_imputation = "constant",
then all missing values are filled with `fill_val`.
num_imputation
The mode of imputation for numerical columns.
If it equals to "mean",
then all missing values are filled with mean value.
If it equals to "median",
then all missing values are filled with median value.
If it equals to "most_frequent",
then all missing values are filled with most frequent value.
If it equals to "drop",
then all numerical columns with missing values will be dropped.
num_null_value
Specified numerical null values which should be recognized.
cat_encoding
The mode of encoding categorical columns.
If it equals to "one_hot", do one-hot encoding.
If it equals to "no_encoding", nothing will be done.
variance_threshold
If it is True,
then dropping numerical columns with variance less than `variance`.
variance
Variance value when variance_threshold = True.
num_scaling
The mode of scaling for numerical columns.
If it equals to "standardize", do standardize for all numerical columns.
If it equals to "minmax", do minmax scaling for all numerical columns.
If it equals to "maxabs", do maxabs scaling for all numerical columns.
If it equals to "no_scaling", nothing will be done.
include_operators
Components included for `clean_ml`, like "one_hot", "standardize", etc.
exclude_operators
Components excluded for `clean_ml`, like "one_hot", "standardize", etc.
"""
    if cat_null_value is None:
        cat_null_value = list(NULL_VALUES)
    if num_null_value is None:
        num_null_value = list(NULL_VALUES)

    training_df = to_dask(training_df)
    test_df = to_dask(test_df)

    col_names = []
    for label, _ in training_df.items():  # doctest: +SKIP
        col_names.append(label)

    for col_name in col_names:
        if col_name == target:
            continue
        if customized_cat_pipeline is not None and customized_num_pipeline is None:
            temp_training_df, temp_test_df = format_data_with_customized_cat(
                training_df[col_name].compute(),
                test_df[col_name].compute(),
                num_imputation,
                num_null_value,
                variance_threshold,
                variance,
                num_scaling,
                include_operators,
                exclude_operators,
                customized_cat_pipeline,
            )
        elif customized_cat_pipeline is None and customized_num_pipeline is not None:
            temp_training_df, temp_test_df = format_data_with_customized_num(
                training_df[col_name].compute(),
                test_df[col_name].compute(),
                cat_imputation,
                cat_null_value,
                fill_val,
                cat_encoding,
                include_operators,
                exclude_operators,
                customized_num_pipeline,
            )
        elif customized_cat_pipeline is None and customized_num_pipeline is None:
            temp_training_df, temp_test_df = format_data_with_default(
                training_df[col_name].compute(),
                test_df[col_name].compute(),
                cat_imputation,
                cat_null_value,
                fill_val,
                num_imputation,
                num_null_value,
                cat_encoding,
                variance_threshold,
                variance,
                num_scaling,
                include_operators,
                exclude_operators,
            )
        elif customized_cat_pipeline is not None and customized_num_pipeline is not None:
            temp_training_df, temp_test_df = format_data_with_customized_cat_and_num(
                training_df[col_name].compute(),
                test_df[col_name].compute(),
                include_operators,
                exclude_operators,
                customized_cat_pipeline,
                customized_num_pipeline,
            )
        if temp_training_df.values.size > 0:
            training_df[col_name] = temp_training_df
            test_df[col_name] = temp_test_df
        else:
            training_df = training_df.drop(columns=[col_name])
            test_df = test_df.drop(columns=[col_name])
    return training_df.compute(), test_df.compute()
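

# Example (hypothetical, not part of this module): a minimal call using only
# the documented defaults might look like the following, assuming `train_df`
# and `test_df` are pandas DataFrames that share a "target" column:
#
#     cleaned_train, cleaned_test = clean_ml(
#         train_df,
#         test_df,
#         target="target",
#         cat_imputation="constant",
#         num_imputation="mean",
#         cat_encoding="one_hot",
#         num_scaling="standardize",
#     )
#
# All keyword arguments shown are the documented defaults except `target`,
# which names the column to leave untouched.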


def format_data_with_customized_cat(
    training_row: dd.Series,
    test_row: dd.Series,
    num_imputation: str = "mean",
    num_null_value: Optional[List[Any]] = None,
    variance_threshold: bool = False,
    variance: float = 0.0,
    num_scaling: str = "standardize",
    include_operators: Optional[List[str]] = None,
    exclude_operators: Optional[List[str]] = None,
    customized_cat_pipeline: Optional[List[Dict[str, Any]]] = None,
) -> Tuple[dd.Series, dd.Series]:
    """
    This function transforms an arbitrary tabular dataset
    into a format that's suitable for a typical ML application.
    A customized categorical pipeline and related parameters should be provided by the user.

    Parameters
    ----------
    training_row
        One column of the training dataset. Dask Series.
    test_row
        One column of the test dataset. Dask Series.
    num_imputation
        The mode of imputation for numerical columns.
        If it is "mean",
        then all missing values are filled with the mean value.
        If it is "median",
        then all missing values are filled with the median value.
        If it is "most_frequent",
        then all missing values are filled with the most frequent value.
        If it is "drop",
        then all numerical columns with missing values are dropped.
    num_null_value
        Specified numerical null values which should be recognized.
    variance_threshold
        If it is True, then numerical columns with variance less than `variance` are dropped.
    variance
        Variance threshold used when variance_threshold = True.
    num_scaling
        The mode of scaling for numerical columns.
        If it is "standardize", standardize all numerical columns.
        If it is "minmax", do min-max scaling for all numerical columns.
        If it is "maxabs", do max-abs scaling for all numerical columns.
        If it is "no_scaling", nothing is done.
    include_operators
        Components included for `clean_ml`, like "one_hot", "standardize", etc.
    exclude_operators
        Components excluded for `clean_ml`, like "one_hot", "standardize", etc.
    customized_cat_pipeline
        User-specified pipeline managing categorical columns.
    """
    cat_pipe_info: Dict[str, Any] = {}
    cat_pipeline = []
    if customized_cat_pipeline is not None:
        for item in customized_cat_pipeline:
            (component_key,) = item
            cat_pipeline.append(component_key)
        cat_pipe_info["cat_pipeline"] = cat_pipeline
        for item in customized_cat_pipeline:
            (component_key,) = item
            if (
                exclude_operators is not None
                and item[component_key]["operator"] in exclude_operators
            ) or (
                include_operators is not None
                and item[component_key]["operator"] not in include_operators
            ):
                cat_pipe_info[component_key] = None
                continue
            for key in item[component_key]:
                if key == "operator":
                    cat_pipe_info[component_key] = item[component_key][key]
                else:
                    cat_pipe_info[key] = item[component_key][key]

    num_pipe_info: Dict[str, Any] = {}
    if variance_threshold:
        num_pipe_info["num_pipeline"] = [
            "num_imputation",
            "variance_threshold",
            "num_scaling",
        ]
        num_pipe_info["variance_threshold"] = variance_threshold
        num_pipe_info["variance"] = variance
    else:
        num_pipe_info["num_pipeline"] = ["num_imputation", "num_scaling"]
    if (exclude_operators is not None and num_imputation in exclude_operators) or (
        include_operators is not None and num_imputation not in include_operators
    ):
        num_pipe_info["num_imputation"] = None
        num_pipe_info["num_null_value"] = None
    else:
        num_pipe_info["num_imputation"] = num_imputation
        num_pipe_info["num_null_value"] = num_null_value
    if (exclude_operators is not None and num_scaling in exclude_operators) or (
        include_operators is not None and num_scaling not in include_operators
    ):
        num_pipe_info["num_scaling"] = None
    else:
        num_pipe_info["num_scaling"] = num_scaling
        if num_scaling == "no_scaling":
            num_pipe_info["num_scaling"] = None
        else:
            num_pipe_info["num_scaling"] = num_scaling

    clean_pipeline = Pipeline(cat_pipe_info, num_pipe_info)
    training_result, test_result = clean_pipeline.fit_transform(training_row, test_row)
    return training_result, test_result
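

# Shape of a customized pipeline (inferred from the parsing logic above; the
# concrete operator names below are illustrative assumptions, not a documented
# API): a list of single-key dicts, where each value holds an "operator" entry
# plus any extra parameters for that component, e.g.
#
#     customized_cat_pipeline = [
#         {"cat_imputation": {"operator": "constant", "fill_val": "missing_value"}},
#         {"cat_encoding": {"operator": "one_hot"}},
#     ]
#
# Each dict key becomes an entry in cat_pipe_info["cat_pipeline"], its
# "operator" value becomes cat_pipe_info[<key>], and all remaining entries are
# copied into cat_pipe_info verbatim.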


def format_data_with_customized_num(
    training_row: dd.Series,
    test_row: dd.Series,
    cat_imputation: str = "constant",
    cat_null_value: Optional[List[Any]] = None,
    fill_val: str = "missing_value",
    cat_encoding: str = "one_hot",
    include_operators: Optional[List[str]] = None,
    exclude_operators: Optional[List[str]] = None,
    customized_num_pipeline: Optional[List[Dict[str, Any]]] = None,
) -> Tuple[dd.Series, dd.Series]:
    """
    This function transforms an arbitrary tabular dataset
    into a format that's suitable for a typical ML application.
    A customized numerical pipeline and related parameters should be provided by the user.

    Parameters
    ----------
    training_row
        One column of the training dataset. Dask Series.
    test_row
        One column of the test dataset. Dask Series.
    cat_imputation
        The mode of imputation for categorical columns.
        If it is "constant",
        then all missing values are filled with `fill_val`.
        If it is "most_frequent",
        then all missing values are filled with the most frequent value.
        If it is "drop",
        then all categorical columns with missing values are dropped.
    cat_null_value
        Specified categorical null values which should be recognized.
    fill_val
        When cat_imputation = "constant", all missing values are filled with `fill_val`.
    cat_encoding
        The mode of encoding categorical columns.
        If it is "one_hot", do one-hot encoding.
        If it is "no_encoding", nothing is done.
    include_operators
        Components included for `clean_ml`, like "one_hot", "standardize", etc.
    exclude_operators
        Components excluded for `clean_ml`, like "one_hot", "standardize", etc.
    customized_num_pipeline
        User-specified pipeline managing numerical columns.
    """
    cat_pipe_info: Dict[str, Any] = {}
    cat_pipe_info["cat_pipeline"] = ["cat_imputation", "cat_encoding"]
    # cat_pipe_info['cat_pipeline'] = ['cat_imputation']
    if (exclude_operators is not None and cat_imputation in exclude_operators) or (
        include_operators is not None and cat_imputation not in include_operators
    ):
        cat_pipe_info["cat_imputation"] = None
        cat_pipe_info["cat_null_value"] = None
        cat_pipe_info["fill_val"] = None
    else:
        cat_pipe_info["cat_imputation"] = cat_imputation
        cat_pipe_info["cat_null_value"] = cat_null_value
        cat_pipe_info["fill_val"] = fill_val
    if (exclude_operators is not None and cat_encoding in exclude_operators) or (
        include_operators is not None and cat_encoding not in include_operators
    ):
        cat_pipe_info["cat_encoding"] = None
    else:
        cat_pipe_info["cat_encoding"] = cat_encoding
        if cat_encoding == "no_encoding":
            cat_pipe_info["cat_encoding"] = None
        else:
            cat_pipe_info["cat_encoding"] = cat_encoding

    num_pipe_info: Dict[str, Any] = {}
    num_pipeline = []
    if customized_num_pipeline is not None:
        for item in customized_num_pipeline:
            (component_key,) = item
            num_pipeline.append(component_key)
        num_pipe_info["num_pipeline"] = num_pipeline
        for item in customized_num_pipeline:
            (component_key,) = item
            if (
                exclude_operators is not None
                and item[component_key]["operator"] in exclude_operators
            ) or (
                include_operators is not None
                and item[component_key]["operator"] not in include_operators
            ):
                num_pipe_info[component_key] = None
                continue
            for key in item[component_key]:
                if key == "operator":
                    num_pipe_info[component_key] = item[component_key][key]
                else:
                    num_pipe_info[key] = item[component_key][key]

    clean_pipeline = Pipeline(cat_pipe_info, num_pipe_info)
    training_result, test_result = clean_pipeline.fit_transform(training_row, test_row)
    return training_result, test_result
def format_data_with_default(
training_row: dd.Series,
test_row: dd.Series,
cat_imputation: str = "constant",
cat_null_value: Optional[List[Any]] = None,
fill_val: str = "missing_value",
num_imputation: str = "mean",
num_null_value: Optional[List[Any]] = None,
cat_encoding: str = "one_hot",
variance_threshold: bool = True,
variance: float = 0.0,
num_scaling: str = "standardize",
include_operators: Optional[List[str]] = None,
exclude_operators: Optional[List[str]] = None,
) -> Tuple[dd.Series, dd.Series]:
"""
This function transforms an arbitrary tabular dataset
into a format that's suitable for a typical ML application.
No customized pipeline should be provided. Use default pipeline.
Parameters
----------
training_row
One column of training dataset. Dask Series.
test_row
One column of test dataset. Dask Series.
cat_imputation
The mode of imputation for categorical columns.
If it equals to "constant",
then all missing values are filled with `fill_val`.
If it equals to "most_frequent",
then all missing values are filled with most frequent value.
If it equals to "drop",
then all categorical columns with missing values will be dropped.
cat_null_value
Specified categorical null values which should be recognized.
fill_val
When cat_imputation = "constant", then all missing values are filled with `fill_val`.
num_imputation
The mode of imputation for numerical columns.
If it equals to "mean",
then all missing values are filled with mean value.
If it equals to "median",
then all missing values are filled with median value.
If it equals to "most_frequent",
then all missing values are filled with most frequent value.
If it equals to "drop",
then all numerical columns with missing values will be dropped.
num_null_value
Specified numerical null values which should be recognized.
cat_encoding
The mode of encoding categorical columns.
If it equals to "one_hot", do one-hot encoding.
If it equals to "no_encoding", nothing will be done.
variance_threshold
If it is True, then dropping numerical columns with variance less than `variance`.
variance
Variance value when variance_threshold = True.
num_scaling
The mode of scaling for numerical columns.
If it equals to "standardize", do standardize for all numerical columns.
If it equals to "minmax", do minmax scaling for all numerical columns.
If it equals to "maxabs", do maxabs scaling for all numerical columns.
If it equals to "no_scaling", nothing will be done.
include_operators
Components included for `clean_ml`, like "one_hot", "standardize", etc.
exclude_operators
Components excluded for `clean_ml`, like "one_hot", "standardize", etc.
"""
    cat_pipe_info: Dict[str, Any] = {}
    cat_pipe_info["cat_pipeline"] = ["cat_imputation", "cat_encoding"]
    if (exclude_operators is not None and cat_imputation in exclude_operators) or (
        include_operators is not None and cat_imputation not in include_operators
    ):
        cat_pipe_info["cat_imputation"] = None
        cat_pipe_info["cat_null_value"] = None
        cat_pipe_info["fill_val"] = None
    else:
        cat_pipe_info["cat_imputation"] = cat_imputation
        cat_pipe_info["cat_null_value"] = cat_null_value
        cat_pipe_info["fill_val"] = fill_val
    # Disable encoding when it is excluded, not in an explicit include list,
    # or set to "no_encoding". Checking these together avoids overwriting the
    # include/exclude result afterwards.
    if (
        (exclude_operators is not None and cat_encoding in exclude_operators)
        or (include_operators is not None and cat_encoding not in include_operators)
        or cat_encoding == "no_encoding"
    ):
        cat_pipe_info["cat_encoding"] = None
    else:
        cat_pipe_info["cat_encoding"] = cat_encoding
    num_pipe_info: Dict[str, Any] = {}
    if variance_threshold:
        num_pipe_info["num_pipeline"] = [
            "num_imputation",
            "variance_threshold",
            "num_scaling",
        ]
        num_pipe_info["variance_threshold"] = variance_threshold
        num_pipe_info["variance"] = variance
    else:
        num_pipe_info["num_pipeline"] = ["num_imputation", "num_scaling"]
    if (exclude_operators is not None and num_imputation in exclude_operators) or (
        include_operators is not None and num_imputation not in include_operators
    ):
        num_pipe_info["num_imputation"] = None
        num_pipe_info["num_null_value"] = None
    else:
        num_pipe_info["num_imputation"] = num_imputation
        num_pipe_info["num_null_value"] = num_null_value
    # Disable scaling when it is excluded, not in an explicit include list,
    # or set to "no_scaling", so the include/exclude result is not overwritten.
    if (
        (exclude_operators is not None and num_scaling in exclude_operators)
        or (include_operators is not None and num_scaling not in include_operators)
        or num_scaling == "no_scaling"
    ):
        num_pipe_info["num_scaling"] = None
    else:
        num_pipe_info["num_scaling"] = num_scaling
    clean_pipeline = Pipeline(cat_pipe_info, num_pipe_info)
    training_result, test_result = clean_pipeline.fit_transform(training_row, test_row)
    return training_result, test_result
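The include/exclude gating repeated above can be expressed as one small helper. The sketch below is illustrative only (the function name `resolve_operator` and the `disabled_value` parameter are not part of the original module); it shows the intended precedence: an exclusion or a missing entry in an explicit include list wins, and the no-op value ("no_encoding"/"no_scaling") also disables the component.

```python
from typing import List, Optional


def resolve_operator(
    operator: str,
    include_operators: Optional[List[str]] = None,
    exclude_operators: Optional[List[str]] = None,
    disabled_value: str = "no_encoding",
) -> Optional[str]:
    """Return the operator name, or None when the component is disabled."""
    if exclude_operators is not None and operator in exclude_operators:
        return None  # explicitly excluded
    if include_operators is not None and operator not in include_operators:
        return None  # an include list was given and this operator is not on it
    if operator == disabled_value:
        return None  # the no-op sentinel disables the component
    return operator
```

With this helper, each `cat_pipe_info[...] = ...` branch above collapses to a single assignment.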
def format_data_with_customized_cat_and_num(
training_row: dd.Series,
test_row: dd.Series,
include_operators: Optional[List[str]] = None,
exclude_operators: Optional[List[str]] = None,
customized_cat_pipeline: Optional[List[Dict[str, Any]]] = None,
customized_num_pipeline: Optional[List[Dict[str, Any]]] = None,
) -> Tuple[dd.Series, dd.Series]:
"""
This function transforms an arbitrary tabular dataset
into a format that's suitable for a typical ML application.
Both customized pipeline managing categorical columns and numerical columns should be provided.
Parameters
----------
training_row
One column of training dataset. Dask Series.
test_row
One column of test dataset. Dask Series.
include_operators
Components included for `clean_ml`, like "one_hot", "standardize", etc.
exclude_operators
Components excluded for `clean_ml`, like "one_hot", "standardize", etc.
customized_cat_pipeline
User-specified pipeline managing categorical columns.
customized_num_pipeline
User-specified pipeline managing numerical columns.
"""
    cat_pipe_info: Dict[str, Any] = {}
    cat_pipeline = []
    if customized_cat_pipeline is not None:
        for item in customized_cat_pipeline:
            # Each item is a single-entry dict; unpack its only key.
            (component_key,) = item
            cat_pipeline.append(component_key)
        cat_pipe_info["cat_pipeline"] = cat_pipeline
        for item in customized_cat_pipeline:
            (component_key,) = item
            if (
                exclude_operators is not None
                and item[component_key]["operator"] in exclude_operators
            ) or (
                include_operators is not None
                and item[component_key]["operator"] not in include_operators
            ):
                cat_pipe_info[component_key] = None
                continue
            for key in item[component_key]:
                if key == "operator":
                    cat_pipe_info[component_key] = item[component_key][key]
                else:
                    cat_pipe_info[key] = item[component_key][key]
    num_pipe_info: Dict[str, Any] = {}
    num_pipeline = []
    if customized_num_pipeline is not None:
        for item in customized_num_pipeline:
            (component_key,) = item
            num_pipeline.append(component_key)
        num_pipe_info["num_pipeline"] = num_pipeline
        for item in customized_num_pipeline:
            (component_key,) = item
            if (
                exclude_operators is not None
                and item[component_key]["operator"] in exclude_operators
            ) or (
                include_operators is not None
                and item[component_key]["operator"] not in include_operators
            ):
                num_pipe_info[component_key] = None
                continue
            for key in item[component_key]:
                if key == "operator":
                    num_pipe_info[component_key] = item[component_key][key]
                else:
                    num_pipe_info[key] = item[component_key][key]
    clean_pipeline = Pipeline(cat_pipe_info, num_pipe_info)
    training_result, test_result = clean_pipeline.fit_transform(training_row, test_row)
    return training_result, test_result
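A customized pipeline is a list of single-entry dicts, each mapping a component name to its settings (an "operator" plus optional extra keys). The flattening performed above can be sketched standalone; the helper name `flatten_pipeline` and the component values shown are illustrative, not part of the original module.

```python
from typing import Any, Dict, List


def flatten_pipeline(customized_pipeline: List[Dict[str, Any]]) -> Dict[str, Any]:
    """Flatten a list of single-entry component dicts into a pipe-info mapping."""
    pipe_info: Dict[str, Any] = {"pipeline": []}
    for item in customized_pipeline:
        (component_key,) = item  # each item holds exactly one component
        pipe_info["pipeline"].append(component_key)
        for key, value in item[component_key].items():
            if key == "operator":
                # The operator is stored under the component's own name.
                pipe_info[component_key] = value
            else:
                # Extra settings (e.g. fill_val) are hoisted to the top level.
                pipe_info[key] = value
    return pipe_info


# Example: constant imputation followed by one-hot encoding.
info = flatten_pipeline(
    [
        {"cat_imputation": {"operator": "constant", "fill_val": "missing"}},
        {"cat_encoding": {"operator": "one_hot"}},
    ]
)
```

Note that hoisting extra settings to the top level means two components must not share a settings key.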
| 41.365979 | 99 | 0.654039 | 3,029 | 24,075 | 4.939254 | 0.053483 | 0.043313 | 0.030145 | 0.031281 | 0.939509 | 0.921329 | 0.915447 | 0.910501 | 0.906022 | 0.894392 | 0 | 0.0004 | 0.273562 | 24,075 | 581 | 100 | 41.437177 | 0.855052 | 0.351111 | 0 | 0.866279 | 0 | 0 | 0.072525 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.014535 | false | 0 | 0.014535 | 0 | 0.043605 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
763c438db1ecf26e705114dd7f7484791bc07662 | 40 | py | Python | recordExample.py | bhkj9999/warmane | e18bf7aa436908cb05ac0b7ec361ee23bec773b1 | [
"MIT"
] | null | null | null | recordExample.py | bhkj9999/warmane | e18bf7aa436908cb05ac0b7ec361ee23bec773b1 | [
"MIT"
] | null | null | null | recordExample.py | bhkj9999/warmane | e18bf7aa436908cb05ac0b7ec361ee23bec773b1 | [
"MIT"
] | null | null | null | md5 = "2ef00178e187c6e9f5cb1490386128d6" | 40 | 40 | 0.875 | 2 | 40 | 17.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.631579 | 0.05 | 40 | 1 | 40 | 40 | 0.289474 | 0 | 0 | 0 | 0 | 0 | 0.780488 | 0.780488 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
767f92029a2c373fb55b86b16de5e76cb9329a9b | 2,039 | py | Python | App/migrations/0003_alter_song_dance_type_alter_song_holiday_and_more.py | dlanghorne0428/StudioMusicPlayer | 54dabab896b96d90b68d6435edfd52fe6a866bc2 | [
"MIT"
] | null | null | null | App/migrations/0003_alter_song_dance_type_alter_song_holiday_and_more.py | dlanghorne0428/StudioMusicPlayer | 54dabab896b96d90b68d6435edfd52fe6a866bc2 | [
"MIT"
] | 44 | 2022-01-21T01:33:59.000Z | 2022-03-26T23:35:25.000Z | App/migrations/0003_alter_song_dance_type_alter_song_holiday_and_more.py | dlanghorne0428/StudioMusicPlayer | 54dabab896b96d90b68d6435edfd52fe6a866bc2 | [
"MIT"
] | null | null | null | # Generated by Django 4.0 on 2022-02-14 03:24
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('App', '0002_playlist_owner'),
]
operations = [
migrations.AlterField(
model_name='song',
name='dance_type',
field=models.CharField(choices=[('Bac', 'Bachata'), ('Bol', 'Bolero'), ('Cha', 'Cha-Cha'), ('C2S', 'Country Two Step'), ('ECS', 'East Coast Swing'), ('Fox', 'Foxtrot'), ('Hus', 'Hustle'), ('Jiv', 'Jive'), ('Mam', 'Mambo/Salsa'), ('Mer', 'Merengue'), ('NC2', 'Nite Club 2-Step'), ('PD', 'Paso Doble'), ('Pea', 'Peabody'), ('Q', 'Quickstep'), ('Rum', 'Rumba'), ('Sam', 'Samba'), ('Tan', 'Tango'), ('VW', 'Viennese Waltz'), ('Wal', 'Waltz'), ('WCS', 'West Coast Swing')], default='Cha', max_length=10),
),
migrations.AlterField(
model_name='song',
name='holiday',
field=models.CharField(blank=True, choices=[('Jul4', '4th of July'), ('Hall', 'Halloween'), ('Xmas', 'Christmas'), ('NYE', "New Year's Eve")], default='', max_length=5),
),
migrations.AlterField(
model_name='songfileinput',
name='dance_type',
field=models.CharField(choices=[('Bac', 'Bachata'), ('Bol', 'Bolero'), ('Cha', 'Cha-Cha'), ('C2S', 'Country Two Step'), ('ECS', 'East Coast Swing'), ('Fox', 'Foxtrot'), ('Hus', 'Hustle'), ('Jiv', 'Jive'), ('Mam', 'Mambo/Salsa'), ('Mer', 'Merengue'), ('NC2', 'Nite Club 2-Step'), ('PD', 'Paso Doble'), ('Pea', 'Peabody'), ('Q', 'Quickstep'), ('Rum', 'Rumba'), ('Sam', 'Samba'), ('Tan', 'Tango'), ('VW', 'Viennese Waltz'), ('Wal', 'Waltz'), ('WCS', 'West Coast Swing')], default='Cha', max_length=10),
),
migrations.AlterField(
model_name='songfileinput',
name='holiday',
field=models.CharField(blank=True, choices=[('Jul4', '4th of July'), ('Hall', 'Halloween'), ('Xmas', 'Christmas'), ('NYE', "New Year's Eve")], default='', max_length=5),
),
]
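Each `choices` list above maps a short stored value (kept in the database, bounded by `max_length`) to a human-readable label, which Django exposes through `get_FOO_display()`. A framework-free sketch of that mapping (the choice subset and the `get_display` helper are illustrative, not Django's implementation):

```python
# A subset of the dance_type choices from the migration above.
DANCE_TYPE_CHOICES = [
    ("Bac", "Bachata"),
    ("Cha", "Cha-Cha"),
    ("Wal", "Waltz"),
]


def get_display(value: str, choices) -> str:
    """Mimic the idea behind Django's get_FOO_display():
    map a stored value to its label, falling back to the raw value."""
    return dict(choices).get(value, value)
```

The stored value stays short ("Cha"), so changing a label never requires a data migration; only adding or removing a choice does.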
| 59.970588 | 511 | 0.542423 | 227 | 2,039 | 4.819383 | 0.449339 | 0.073126 | 0.091408 | 0.106033 | 0.870201 | 0.870201 | 0.786106 | 0.786106 | 0.786106 | 0.786106 | 0 | 0.020897 | 0.20206 | 2,039 | 33 | 512 | 61.787879 | 0.651506 | 0.021089 | 0 | 0.740741 | 1 | 0 | 0.343029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
768929c02f0a6e19fc4817ee36305d4a2cdc1465 | 49 | py | Python | medgpc/evaluation/__init__.py | bee-hive/MedGP | 596a24ca519900507cce42cb4e2061319cef801e | [
"BSD-3-Clause"
] | 25 | 2018-03-18T18:09:03.000Z | 2022-02-24T07:47:33.000Z | medgpc/evaluation/__init__.py | bee-hive/MedGP | 596a24ca519900507cce42cb4e2061319cef801e | [
"BSD-3-Clause"
] | 3 | 2021-04-12T16:11:00.000Z | 2021-04-12T16:26:17.000Z | medgpc/evaluation/__init__.py | bee-hive/MedGP | 596a24ca519900507cce42cb4e2061319cef801e | [
"BSD-3-Clause"
] | 4 | 2019-04-27T23:18:26.000Z | 2021-12-03T20:19:09.000Z | from . import evals
from . import run_medgpc_eval | 24.5 | 29 | 0.816327 | 8 | 49 | 4.75 | 0.75 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 49 | 2 | 29 | 24.5 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
76a63457086d063c00e2d943a130393c23f2defa | 127,659 | py | Python | napalm_yang/models/openconfig/network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/__init__.py | ckishimo/napalm-yang | 8f2bd907bd3afcde3c2f8e985192de74748baf6c | [
"Apache-2.0"
] | 64 | 2016-10-20T15:47:18.000Z | 2021-11-11T11:57:32.000Z | napalm_yang/models/openconfig/network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/__init__.py | ckishimo/napalm-yang | 8f2bd907bd3afcde3c2f8e985192de74748baf6c | [
"Apache-2.0"
] | 126 | 2016-10-05T10:36:14.000Z | 2019-05-15T08:43:23.000Z | napalm_yang/models/openconfig/network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/__init__.py | ckishimo/napalm-yang | 8f2bd907bd3afcde3c2f8e985192de74748baf6c | [
"Apache-2.0"
] | 63 | 2016-11-07T15:23:08.000Z | 2021-09-22T14:41:16.000Z | # -*- coding: utf-8 -*-
from operator import attrgetter
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType
from pyangbind.lib.yangtypes import RestrictedClassType
from pyangbind.lib.yangtypes import TypedListType
from pyangbind.lib.yangtypes import YANGBool
from pyangbind.lib.yangtypes import YANGListType
from pyangbind.lib.yangtypes import YANGDynClass
from pyangbind.lib.yangtypes import ReferenceType
from pyangbind.lib.base import PybindBase
from collections import OrderedDict
from decimal import Decimal
from bitarray import bitarray
import six
# PY3 support of some PY2 keywords (needs improvement)
if six.PY3:
import builtins as __builtin__
long = int
elif six.PY2:
import __builtin__
class state(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module openconfig-network-instance - based on the path /network-instances/network-instance/protocols/protocol/ospfv2/areas/area/virtual-links/virtual-link/state. Each member element of
the container is represented as a class variable - with a specific
YANG type.
YANG Description: State parameters relating to the OSPF virtual link
"""
__slots__ = (
"_path_helper",
"_extmethods",
"__remote_router_id",
"__priority",
"__dead_time",
"__designated_router",
"__backup_designated_router",
"__optional_capabilities",
"__last_established_time",
"__adjacency_state",
"__state_changes",
"__retranmission_queue_length",
)
_yang_name = "state"
_pybind_generated_by = "container"
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__remote_router_id = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
self.__priority = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
self.__dead_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
self.__designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
self.__backup_designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
self.__optional_capabilities = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
self.__last_established_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
self.__adjacency_state = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
self.__state_changes = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
self.__retranmission_queue_length = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return [
"network-instances",
"network-instance",
"protocols",
"protocol",
"ospfv2",
"areas",
"area",
"virtual-links",
"virtual-link",
"state",
]
def _get_remote_router_id(self):
"""
Getter method for remote_router_id, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/remote_router_id (inet:ipv4-address-no-zone)
YANG Description: The router ID of the device which terminates the remote end
of the virtual link
"""
return self.__remote_router_id
def _set_remote_router_id(self, v, load=False):
"""
Setter method for remote_router_id, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/remote_router_id (inet:ipv4-address-no-zone)
If this variable is read-only (config: false) in the
source YANG file, then _set_remote_router_id is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_remote_router_id() directly.
YANG Description: The router ID of the device which terminates the remote end
of the virtual link
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """remote_router_id must be of a type compatible with inet:ipv4-address-no-zone""",
"defined-type": "inet:ipv4-address-no-zone",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), restriction_dict={'pattern': '[0-9\\.]*'}), is_leaf=True, yang_name="remote-router-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='inet:ipv4-address-no-zone', is_config=False)""",
}
)
self.__remote_router_id = t
if hasattr(self, "_set"):
self._set()
def _unset_remote_router_id(self):
self.__remote_router_id = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
def _get_priority(self):
"""
Getter method for priority, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/priority (uint8)
YANG Description: The remote system's priority to become the designated
router
"""
return self.__priority
def _set_priority(self, v, load=False):
"""
Setter method for priority, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_priority is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_priority() directly.
YANG Description: The remote system's priority to become the designated
router
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), is_leaf=True, yang_name="priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=False)""",
}
)
self.__priority = t
if hasattr(self, "_set"):
self._set()
def _unset_priority(self):
self.__priority = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
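The generated getter/setter/unset triples above delegate validation to pyangbind's `RestrictedClassType`; for the `priority` leaf this enforces the YANG uint8 range "0..255" and raises `ValueError` otherwise. A minimal standalone sketch of that range-validation idea (this is not pyangbind's actual implementation, and the class name is hypothetical):

```python
class RangeRestrictedInt(int):
    """An int subclass enforcing a closed range, similar in spirit to
    RestrictedClassType(base_type=int, restriction_dict={"range": ["0..255"]})."""

    LOW, HIGH = 0, 255  # uint8 range used by the `priority` leaf

    def __new__(cls, value):
        value = int(value)
        if not cls.LOW <= value <= cls.HIGH:
            # Mirrors the ValueError raised by the generated _set_priority().
            raise ValueError(
                "value %d outside range %d..%d" % (value, cls.LOW, cls.HIGH)
            )
        return super().__new__(cls, value)
```

`RangeRestrictedInt(128)` succeeds, while `RangeRestrictedInt(300)` raises, which is the same contract the generated `_set_priority()` exposes to backends.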
def _get_dead_time(self):
"""
Getter method for dead_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/dead_time (oc-types:timeticks64)
YANG Description: The time at which this neighbor's adjacency will be
considered dead. This value is expressed as a number of
seconds since the Unix Epoch
"""
return self.__dead_time
def _set_dead_time(self, v, load=False):
"""
Setter method for dead_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/dead_time (oc-types:timeticks64)
If this variable is read-only (config: false) in the
source YANG file, then _set_dead_time is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_dead_time() directly.
YANG Description: The time at which this neighbor's adjacency will be
considered dead. This value is expressed as a number of
seconds since the Unix Epoch
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """dead_time must be of a type compatible with oc-types:timeticks64""",
"defined-type": "oc-types:timeticks64",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="dead-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='oc-types:timeticks64', is_config=False)""",
}
)
self.__dead_time = t
if hasattr(self, "_set"):
self._set()
def _unset_dead_time(self):
self.__dead_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
def _get_designated_router(self):
"""
Getter method for designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/designated_router (yang:dotted-quad)
YANG Description: The designated router for the adjacency. This device
advertises the Network LSA for broadcast and NBMA networks.
"""
return self.__designated_router
def _set_designated_router(self, v, load=False):
"""
Setter method for designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/designated_router (yang:dotted-quad)
If this variable is read-only (config: false) in the
source YANG file, then _set_designated_router is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_designated_router() directly.
YANG Description: The designated router for the adjacency. This device
advertises the Network LSA for broadcast and NBMA networks.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """designated_router must be of a type compatible with yang:dotted-quad""",
"defined-type": "yang:dotted-quad",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])'}), is_leaf=True, yang_name="designated-router", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:dotted-quad', is_config=False)""",
}
)
self.__designated_router = t
if hasattr(self, "_set"):
self._set()
def _unset_designated_router(self):
self.__designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
def _get_backup_designated_router(self):
"""
Getter method for backup_designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/backup_designated_router (yang:dotted-quad)
YANG Description: The backup designated router for the adjacency.
"""
return self.__backup_designated_router
def _set_backup_designated_router(self, v, load=False):
"""
Setter method for backup_designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/backup_designated_router (yang:dotted-quad)
If this variable is read-only (config: false) in the
source YANG file, then _set_backup_designated_router is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_backup_designated_router() directly.
YANG Description: The backup designated router for the adjacency.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """backup_designated_router must be of a type compatible with yang:dotted-quad""",
"defined-type": "yang:dotted-quad",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])'}), is_leaf=True, yang_name="backup-designated-router", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:dotted-quad', is_config=False)""",
}
)
self.__backup_designated_router = t
if hasattr(self, "_set"):
self._set()
def _unset_backup_designated_router(self):
self.__backup_designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
def _get_optional_capabilities(self):
"""
Getter method for optional_capabilities, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/optional_capabilities (yang:hex-string)
YANG Description: The optional capabilities field received in the Hello
message from the neighbor.
"""
return self.__optional_capabilities
def _set_optional_capabilities(self, v, load=False):
"""
Setter method for optional_capabilities, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/optional_capabilities (yang:hex-string)
If this variable is read-only (config: false) in the
source YANG file, then _set_optional_capabilities is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_optional_capabilities() directly.
YANG Description: The optional capabilities field received in the Hello
message from the neighbor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """optional_capabilities must be of a type compatible with yang:hex-string""",
"defined-type": "yang:hex-string",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?'}), is_leaf=True, yang_name="optional-capabilities", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:hex-string', is_config=False)""",
}
)
self.__optional_capabilities = t
if hasattr(self, "_set"):
self._set()
def _unset_optional_capabilities(self):
self.__optional_capabilities = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
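The `yang:hex-string` leaf above is validated purely by the regex in `restriction_dict`. As a hedged, standalone sketch (standard-library `re` only; the helper name `is_hex_string` is illustrative, not part of pyangbind), the same pattern accepts colon-separated two-digit hex octets:

```python
import re

# Same pattern as the optional-capabilities leaf: zero or more
# colon-separated octets, each exactly two hex digits.
HEX_STRING = r"([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"

def is_hex_string(value):
    # fullmatch reproduces the implicit anchoring of YANG patterns
    return re.fullmatch(HEX_STRING, value) is not None

print(is_hex_string("52"))      # True - a single octet
print(is_hex_string("52:2f"))   # True - colon-separated octets
print(is_hex_string(""))        # True - the whole pattern is optional
print(is_hex_string("5"))       # False - octets need two hex digits
```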
def _get_last_established_time(self):
"""
Getter method for last_established_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/last_established_time (oc-types:timeticks64)
YANG Description: The time at which the adjacency was last established with
the neighbor. That is to say, the time at which the
adjacency last transitioned into the FULL state.
This value is expressed as the number of seconds, relative to
the Unix Epoch (Jan 1, 1970 00:00:00 UTC).
"""
return self.__last_established_time
def _set_last_established_time(self, v, load=False):
"""
Setter method for last_established_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/last_established_time (oc-types:timeticks64)
If this variable is read-only (config: false) in the
source YANG file, then _set_last_established_time is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_last_established_time() directly.
YANG Description: The time at which the adjacency was last established with
the neighbor. That is to say, the time at which the
adjacency last transitioned into the FULL state.
This value is expressed as the number of seconds, relative to
the Unix Epoch (Jan 1, 1970 00:00:00 UTC).
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """last_established_time must be of a type compatible with oc-types:timeticks64""",
"defined-type": "oc-types:timeticks64",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="last-established-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='oc-types:timeticks64', is_config=False)""",
}
)
self.__last_established_time = t
if hasattr(self, "_set"):
self._set()
def _unset_last_established_time(self):
self.__last_established_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
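Per the docstring, `oc-types:timeticks64` here carries seconds relative to the Unix epoch, constrained to the uint64 range enforced by `restriction_dict`. A minimal sketch (helper name and sample value are hypothetical, not taken from this module) of range-checking and rendering such a value:

```python
from datetime import datetime, timezone

# Range enforced by the leaf: 0..18446744073709551615 (uint64).
UINT64_MAX = 2**64 - 1

def in_timeticks64_range(value):
    # Mirrors the '0..18446744073709551615' range restriction
    return 0 <= value <= UINT64_MAX

# Hypothetical last-established-time sample, seconds since the epoch.
sample = 1700000000
print(in_timeticks64_range(sample))   # True
print(datetime.fromtimestamp(sample, tz=timezone.utc).isoformat())
```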
def _get_adjacency_state(self):
"""
Getter method for adjacency_state, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/adjacency_state (identityref)
YANG Description: The state of the adjacency with the neighbor.
"""
return self.__adjacency_state
def _set_adjacency_state(self, v, load=False):
"""
Setter method for adjacency_state, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/adjacency_state (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_adjacency_state is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_adjacency_state() directly.
YANG Description: The state of the adjacency with the neighbor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """adjacency_state must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 
'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}},), is_leaf=True, yang_name="adjacency-state", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=False)""",
}
)
self.__adjacency_state = t
if hasattr(self, "_set"):
self._set()
def _unset_adjacency_state(self):
self.__adjacency_state = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
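The `dict_key` restriction used for `adjacency-state` means a value is accepted only if it is literally one of the keys of `restriction_arg`, which is why every identity appears in bare, module-prefixed, and short-prefixed forms. A hedged sketch using a hypothetical subset of those keys:

```python
# Hypothetical subset of the adjacency-state restriction_arg keys:
# the same identity in its bare and prefixed spellings.
ADJ_STATES = {
    "FULL": {"@module": "openconfig-ospf-types"},
    "oc-ospf-types:FULL": {"@module": "openconfig-ospf-types"},
    "oc-ospft:FULL": {"@module": "openconfig-ospf-types"},
}

def is_valid_identity(value, allowed=ADJ_STATES):
    # dict_key restriction: a plain membership test against the keys
    return value in allowed

print(is_valid_identity("FULL"))           # True
print(is_valid_identity("oc-ospft:FULL"))  # True
print(is_valid_identity("full"))           # False - keys are case-sensitive
```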
def _get_state_changes(self):
"""
Getter method for state_changes, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/state_changes (uint32)
YANG Description: The number of transitions out of the FULL state that this
neighbor has been through.
"""
return self.__state_changes
def _set_state_changes(self, v, load=False):
"""
Setter method for state_changes, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/state_changes (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_state_changes is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_state_changes() directly.
YANG Description: The number of transitions out of the FULL state that this
neighbor has been through.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """state_changes must be of a type compatible with uint32""",
"defined-type": "uint32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="state-changes", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint32', is_config=False)""",
}
)
self.__state_changes = t
if hasattr(self, "_set"):
self._set()
def _unset_state_changes(self):
self.__state_changes = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
def _get_retranmission_queue_length(self):
"""
Getter method for retranmission_queue_length, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/retranmission_queue_length (uint32)
YANG Description: The number of LSAs that are currently in the queue to be
retransmitted to the neighbor.
"""
return self.__retranmission_queue_length
def _set_retranmission_queue_length(self, v, load=False):
"""
Setter method for retranmission_queue_length, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/retranmission_queue_length (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_retranmission_queue_length is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_retranmission_queue_length() directly.
YANG Description: The number of LSAs that are currently in the queue to be
retransmitted to the neighbor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """retranmission_queue_length must be of a type compatible with uint32""",
"defined-type": "uint32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="retranmission-queue-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint32', is_config=False)""",
}
)
self.__retranmission_queue_length = t
if hasattr(self, "_set"):
self._set()
def _unset_retranmission_queue_length(self):
self.__retranmission_queue_length = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
remote_router_id = __builtin__.property(_get_remote_router_id)
priority = __builtin__.property(_get_priority)
dead_time = __builtin__.property(_get_dead_time)
designated_router = __builtin__.property(_get_designated_router)
backup_designated_router = __builtin__.property(_get_backup_designated_router)
optional_capabilities = __builtin__.property(_get_optional_capabilities)
last_established_time = __builtin__.property(_get_last_established_time)
adjacency_state = __builtin__.property(_get_adjacency_state)
state_changes = __builtin__.property(_get_state_changes)
retranmission_queue_length = __builtin__.property(_get_retranmission_queue_length)
_pyangbind_elements = OrderedDict(
[
("remote_router_id", remote_router_id),
("priority", priority),
("dead_time", dead_time),
("designated_router", designated_router),
("backup_designated_router", backup_designated_router),
("optional_capabilities", optional_capabilities),
("last_established_time", last_established_time),
("adjacency_state", adjacency_state),
("state_changes", state_changes),
("retranmission_queue_length", retranmission_queue_length),
]
)
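Every `yang:dotted-quad` leaf in this container shares the same octet regex. A self-contained sketch (the helper name is illustrative) of what that pattern accepts under the implicit anchoring of YANG patterns, which `re.fullmatch` reproduces:

```python
import re

# The dotted-quad pattern shared by designated-router and
# backup-designated-router: four octets, each 0-255, no leading zeros.
OCTET = r"([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
DOTTED_QUAD = r"(%s\.){3}%s" % (OCTET, OCTET)

def is_dotted_quad(value):
    return re.fullmatch(DOTTED_QUAD, value) is not None

print(is_dotted_quad("192.0.2.1"))   # True
print(is_dotted_quad("256.0.0.1"))   # False - octet out of range
print(is_dotted_quad("192.0.2"))     # False - only three octets
```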
class state(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module openconfig-network-instance-l2 - based on the path /network-instances/network-instance/protocols/protocol/ospfv2/areas/area/virtual-links/virtual-link/state. Each member element of
the container is represented as a class variable, with a specific
YANG type.
YANG Description: State parameters relating to the OSPF virtual link
"""
__slots__ = (
"_path_helper",
"_extmethods",
"__remote_router_id",
"__priority",
"__dead_time",
"__designated_router",
"__backup_designated_router",
"__optional_capabilities",
"__last_established_time",
"__adjacency_state",
"__state_changes",
"__retranmission_queue_length",
)
_yang_name = "state"
_pybind_generated_by = "container"
def __init__(self, *args, **kwargs):
self._path_helper = False
self._extmethods = False
self.__remote_router_id = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
self.__priority = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
self.__dead_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
self.__designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
self.__backup_designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
self.__optional_capabilities = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
self.__last_established_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
self.__adjacency_state = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
self.__state_changes = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
self.__retranmission_queue_length = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
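# Hedged usage sketch (names are illustrative, not part of the generated
# API surface): a compatible object exposing the same pyangbind elements
# can seed this container. Only leaves whose _changed() flag is set are
# copied, and an optional load flag is forwarded to each generated setter:
#
#   copied = state(existing_state)             # copies changed leaves only
#   loaded = state(existing_state, load=True)  # forwards load to setters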
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path() + [self._yang_name]
else:
return [
"network-instances",
"network-instance",
"protocols",
"protocol",
"ospfv2",
"areas",
"area",
"virtual-links",
"virtual-link",
"state",
]
def _get_remote_router_id(self):
"""
Getter method for remote_router_id, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/remote_router_id (inet:ipv4-address-no-zone)
YANG Description: The router ID of the device which terminates the remote end
of the virtual link
"""
return self.__remote_router_id
def _set_remote_router_id(self, v, load=False):
"""
Setter method for remote_router_id, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/remote_router_id (inet:ipv4-address-no-zone)
If this variable is read-only (config: false) in the
source YANG file, then _set_remote_router_id is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_remote_router_id() directly.
YANG Description: The router ID of the device which terminates the remote end
of the virtual link
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """remote_router_id must be of a type compatible with inet:ipv4-address-no-zone""",
"defined-type": "inet:ipv4-address-no-zone",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?'}), restriction_dict={'pattern': '[0-9\\.]*'}), is_leaf=True, yang_name="remote-router-id", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='inet:ipv4-address-no-zone', is_config=False)""",
}
)
self.__remote_router_id = t
if hasattr(self, "_set"):
self._set()
def _unset_remote_router_id(self):
self.__remote_router_id = YANGDynClass(
base=RestrictedClassType(
base_type=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])(%[\\p{N}\\p{L}]+)?"
},
),
restriction_dict={"pattern": "[0-9\\.]*"},
),
is_leaf=True,
yang_name="remote-router-id",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="inet:ipv4-address-no-zone",
is_config=False,
)
def _get_priority(self):
"""
Getter method for priority, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/priority (uint8)
YANG Description: The remote system's priority to become the designated
router
"""
return self.__priority
def _set_priority(self, v, load=False):
"""
Setter method for priority, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/priority (uint8)
If this variable is read-only (config: false) in the
source YANG file, then _set_priority is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_priority() directly.
YANG Description: The remote system's priority to become the designated
router
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """priority must be of a type compatible with uint8""",
"defined-type": "uint8",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=int, restriction_dict={'range': ['0..255']}, int_size=8), is_leaf=True, yang_name="priority", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint8', is_config=False)""",
}
)
self.__priority = t
if hasattr(self, "_set"):
self._set()
def _unset_priority(self):
self.__priority = YANGDynClass(
base=RestrictedClassType(
base_type=int, restriction_dict={"range": ["0..255"]}, int_size=8
),
is_leaf=True,
yang_name="priority",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint8",
is_config=False,
)
def _get_dead_time(self):
"""
Getter method for dead_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/dead_time (oc-types:timeticks64)
YANG Description: The time at which this neighbor's adjacency will be
considered dead. This value is expressed as a number of
seconds since the Unix Epoch
"""
return self.__dead_time
def _set_dead_time(self, v, load=False):
"""
Setter method for dead_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/dead_time (oc-types:timeticks64)
If this variable is read-only (config: false) in the
source YANG file, then _set_dead_time is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_dead_time() directly.
YANG Description: The time at which this neighbor's adjacency will be
considered dead. This value is expressed as a number of
seconds since the Unix Epoch
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """dead_time must be of a type compatible with oc-types:timeticks64""",
"defined-type": "oc-types:timeticks64",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="dead-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='oc-types:timeticks64', is_config=False)""",
}
)
self.__dead_time = t
if hasattr(self, "_set"):
self._set()
def _unset_dead_time(self):
self.__dead_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="dead-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
def _get_designated_router(self):
"""
Getter method for designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/designated_router (yang:dotted-quad)
YANG Description: The designated router for the adjacency. This device
advertises the Network LSA for broadcast and NBMA networks.
"""
return self.__designated_router
def _set_designated_router(self, v, load=False):
"""
Setter method for designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/designated_router (yang:dotted-quad)
If this variable is read-only (config: false) in the
source YANG file, then _set_designated_router is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_designated_router() directly.
YANG Description: The designated router for the adjacency. This device
advertises the Network LSA for broadcast and NBMA networks.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """designated_router must be of a type compatible with yang:dotted-quad""",
"defined-type": "yang:dotted-quad",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])'}), is_leaf=True, yang_name="designated-router", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:dotted-quad', is_config=False)""",
}
)
self.__designated_router = t
if hasattr(self, "_set"):
self._set()
def _unset_designated_router(self):
self.__designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
def _get_backup_designated_router(self):
"""
Getter method for backup_designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/backup_designated_router (yang:dotted-quad)
YANG Description: The backup designated router for the adjacency.
"""
return self.__backup_designated_router
def _set_backup_designated_router(self, v, load=False):
"""
Setter method for backup_designated_router, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/backup_designated_router (yang:dotted-quad)
If this variable is read-only (config: false) in the
source YANG file, then _set_backup_designated_router is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_backup_designated_router() directly.
YANG Description: The backup designated router for the adjacency.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """backup_designated_router must be of a type compatible with yang:dotted-quad""",
"defined-type": "yang:dotted-quad",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])'}), is_leaf=True, yang_name="backup-designated-router", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:dotted-quad', is_config=False)""",
}
)
self.__backup_designated_router = t
if hasattr(self, "_set"):
self._set()
def _unset_backup_designated_router(self):
self.__backup_designated_router = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={
"pattern": "(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\\.){3}([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])"
},
),
is_leaf=True,
yang_name="backup-designated-router",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:dotted-quad",
is_config=False,
)
def _get_optional_capabilities(self):
"""
Getter method for optional_capabilities, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/optional_capabilities (yang:hex-string)
YANG Description: The optional capabilities field received in the Hello
message from the neighbor
"""
return self.__optional_capabilities
def _set_optional_capabilities(self, v, load=False):
"""
Setter method for optional_capabilities, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/optional_capabilities (yang:hex-string)
If this variable is read-only (config: false) in the
source YANG file, then _set_optional_capabilities is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_optional_capabilities() directly.
YANG Description: The optional capabilities field received in the Hello
message from the neighbor
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """optional_capabilities must be of a type compatible with yang:hex-string""",
"defined-type": "yang:hex-string",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_dict={'pattern': '([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?'}), is_leaf=True, yang_name="optional-capabilities", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='yang:hex-string', is_config=False)""",
}
)
self.__optional_capabilities = t
if hasattr(self, "_set"):
self._set()
def _unset_optional_capabilities(self):
self.__optional_capabilities = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_dict={"pattern": "([0-9a-fA-F]{2}(:[0-9a-fA-F]{2})*)?"},
),
is_leaf=True,
yang_name="optional-capabilities",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="yang:hex-string",
is_config=False,
)
def _get_last_established_time(self):
"""
Getter method for last_established_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/last_established_time (oc-types:timeticks64)
YANG Description: The time at which the adjacency was last established with
the neighbor. That is to say, the time at which the
adjacency last transitioned into the FULL state.
This value is expressed as the number of seconds, relative to
the Unix Epoch (Jan 1, 1970 00:00:00 UTC).
"""
return self.__last_established_time
def _set_last_established_time(self, v, load=False):
"""
Setter method for last_established_time, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/last_established_time (oc-types:timeticks64)
If this variable is read-only (config: false) in the
source YANG file, then _set_last_established_time is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_last_established_time() directly.
YANG Description: The time at which the adjacency was last established with
the neighbor. That is to say, the time at which the
adjacency last transitioned into the FULL state.
This value is expressed as the number of seconds, relative to
the Unix Epoch (Jan 1, 1970 00:00:00 UTC).
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """last_established_time must be of a type compatible with oc-types:timeticks64""",
"defined-type": "oc-types:timeticks64",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..18446744073709551615']}, int_size=64), is_leaf=True, yang_name="last-established-time", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='oc-types:timeticks64', is_config=False)""",
}
)
self.__last_established_time = t
if hasattr(self, "_set"):
self._set()
def _unset_last_established_time(self):
self.__last_established_time = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..18446744073709551615"]},
int_size=64,
),
is_leaf=True,
yang_name="last-established-time",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="oc-types:timeticks64",
is_config=False,
)
def _get_adjacency_state(self):
"""
Getter method for adjacency_state, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/adjacency_state (identityref)
YANG Description: The state of the adjacency with the neighbor.
"""
return self.__adjacency_state
def _set_adjacency_state(self, v, load=False):
"""
Setter method for adjacency_state, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/adjacency_state (identityref)
If this variable is read-only (config: false) in the
source YANG file, then _set_adjacency_state is considered a private
method. Backends looking to populate this variable should
do so by calling thisObj._set_adjacency_state() directly.
YANG Description: The state of the adjacency with the neighbor.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """adjacency_state must be of a type compatible with identityref""",
"defined-type": "openconfig-network-instance:identityref",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=six.text_type, restriction_type="dict_key", restriction_arg={'DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:DOWN': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:ATTEMPT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:INIT': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:TWO_WAY': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:EXSTART': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 
'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:EXCHANGE': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:LOADING': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospf-types:FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}, 'oc-ospft:FULL': {'@module': 'openconfig-ospf-types', '@namespace': 'http://openconfig.net/yang/ospf-types'}},), is_leaf=True, yang_name="adjacency-state", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='identityref', is_config=False)""",
}
)
self.__adjacency_state = t
if hasattr(self, "_set"):
self._set()
def _unset_adjacency_state(self):
self.__adjacency_state = YANGDynClass(
base=RestrictedClassType(
base_type=six.text_type,
restriction_type="dict_key",
restriction_arg={
"DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:DOWN": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:ATTEMPT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:INIT": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:TWO_WAY": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXSTART": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:EXCHANGE": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:LOADING": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospf-types:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
"oc-ospft:FULL": {
"@module": "openconfig-ospf-types",
"@namespace": "http://openconfig.net/yang/ospf-types",
},
},
),
is_leaf=True,
yang_name="adjacency-state",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="identityref",
is_config=False,
)
def _get_state_changes(self):
"""
Getter method for state_changes, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/state_changes (uint32)
YANG Description: The number of transitions out of the FULL state that this
neighbor has been through
"""
return self.__state_changes
def _set_state_changes(self, v, load=False):
"""
Setter method for state_changes, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/state_changes (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_state_changes is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_state_changes() directly.
YANG Description: The number of transitions out of the FULL state that this
neighbor has been through
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """state_changes must be of a type compatible with uint32""",
"defined-type": "uint32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="state-changes", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint32', is_config=False)""",
}
)
self.__state_changes = t
if hasattr(self, "_set"):
self._set()
def _unset_state_changes(self):
self.__state_changes = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="state-changes",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
def _get_retranmission_queue_length(self):
"""
Getter method for retranmission_queue_length, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/retranmission_queue_length (uint32)
YANG Description: The number of LSAs that are currently in the queue to be
retransmitted to the neighbor
"""
return self.__retranmission_queue_length
def _set_retranmission_queue_length(self, v, load=False):
"""
Setter method for retranmission_queue_length, mapped from YANG variable /network_instances/network_instance/protocols/protocol/ospfv2/areas/area/virtual_links/virtual_link/state/retranmission_queue_length (uint32)
If this variable is read-only (config: false) in the
source YANG file, then _set_retranmission_queue_length is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_retranmission_queue_length() directly.
YANG Description: The number of LSAs that are currently in the queue to be
retransmitted to the neighbor
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(
v,
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
except (TypeError, ValueError):
raise ValueError(
{
"error-string": """retranmission_queue_length must be of a type compatible with uint32""",
"defined-type": "uint32",
"generated-type": """YANGDynClass(base=RestrictedClassType(base_type=long, restriction_dict={'range': ['0..4294967295']}, int_size=32), is_leaf=True, yang_name="retranmission-queue-length", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, namespace='http://openconfig.net/yang/network-instance', defining_module='openconfig-network-instance', yang_type='uint32', is_config=False)""",
}
)
self.__retranmission_queue_length = t
if hasattr(self, "_set"):
self._set()
def _unset_retranmission_queue_length(self):
self.__retranmission_queue_length = YANGDynClass(
base=RestrictedClassType(
base_type=long,
restriction_dict={"range": ["0..4294967295"]},
int_size=32,
),
is_leaf=True,
yang_name="retranmission-queue-length",
parent=self,
path_helper=self._path_helper,
extmethods=self._extmethods,
register_paths=True,
namespace="http://openconfig.net/yang/network-instance",
defining_module="openconfig-network-instance",
yang_type="uint32",
is_config=False,
)
remote_router_id = __builtin__.property(_get_remote_router_id)
priority = __builtin__.property(_get_priority)
dead_time = __builtin__.property(_get_dead_time)
designated_router = __builtin__.property(_get_designated_router)
backup_designated_router = __builtin__.property(_get_backup_designated_router)
optional_capabilities = __builtin__.property(_get_optional_capabilities)
last_established_time = __builtin__.property(_get_last_established_time)
adjacency_state = __builtin__.property(_get_adjacency_state)
state_changes = __builtin__.property(_get_state_changes)
retranmission_queue_length = __builtin__.property(_get_retranmission_queue_length)
_pyangbind_elements = OrderedDict(
[
("remote_router_id", remote_router_id),
("priority", priority),
("dead_time", dead_time),
("designated_router", designated_router),
("backup_designated_router", backup_designated_router),
("optional_capabilities", optional_capabilities),
("last_established_time", last_established_time),
("adjacency_state", adjacency_state),
("state_changes", state_changes),
("retranmission_queue_length", retranmission_queue_length),
]
)
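The generated `_unset_adjacency_state` above rebuilds the leaf with a `dict_key` restriction: a value is accepted only if it matches one of the listed identity names, with or without a module prefix. A minimal self-contained sketch of that idea — `AdjacencyState` here is an illustrative stand-in, not pyangbind's actual `RestrictedClassType`:

```python
class AdjacencyState(str):
    """String subtype that only accepts known OSPF adjacency identities."""

    # Same identity names as the generated restriction_arg; the bare,
    # 'oc-ospf-types:'-prefixed, and 'oc-ospft:'-prefixed spellings
    # are all accepted, mirroring the generated dict keys.
    _states = ('DOWN', 'ATTEMPT', 'INIT', 'TWO_WAY',
               'EXSTART', 'EXCHANGE', 'LOADING', 'FULL')
    _allowed = frozenset(
        prefix + state
        for state in _states
        for prefix in ('', 'oc-ospf-types:', 'oc-ospft:')
    )

    def __new__(cls, value):
        if value not in cls._allowed:
            raise ValueError('%r is not a valid adjacency-state' % (value,))
        return super(AdjacencyState, cls).__new__(cls, value)
```

An invalid key raises `ValueError`, which is what the generated setter catches and re-wraps in its `error-string` dictionary.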
| 48.484239 | 3,095 | 0.549041 | 12,990 | 127,659 | 5.208468 | 0.02194 | 0.059594 | 0.092465 | 0.104526 | 0.994457 | 0.99125 | 0.99125 | 0.99125 | 0.99125 | 0.99125 | 0 | 0.022349 | 0.323542 | 127,659 | 2,632 | 3,096 | 48.50266 | 0.761128 | 0.15185 | 0 | 0.854913 | 0 | 0.020202 | 0.370949 | 0.154545 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029385 | false | 0 | 0.006887 | 0 | 0.061065 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
76bf069c848264ba7cfcb721dd3816af56b405ab | 6,588 | py | Python | tests/unit_tests/test_pdb/test_biopython_utils.py | MauriceKarrenbrock/PythonPDBStructures | 694c1ca6d600d97a8b514f6006034f2ff227e969 | [
"BSD-3-Clause"
] | null | null | null | tests/unit_tests/test_pdb/test_biopython_utils.py | MauriceKarrenbrock/PythonPDBStructures | 694c1ca6d600d97a8b514f6006034f2ff227e969 | [
"BSD-3-Clause"
] | null | null | null | tests/unit_tests/test_pdb/test_biopython_utils.py | MauriceKarrenbrock/PythonPDBStructures | 694c1ca6d600d97a8b514f6006034f2ff227e969 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
# pylint: disable=missing-docstring
# pylint: disable=redefined-outer-name
# pylint: disable=no-self-use
#############################################################
# Copyright (c) 2020-2020 Maurice Karrenbrock #
# #
# This software is open-source and is distributed under the #
# BSD 3-Clause "New" or "Revised" License #
#############################################################
import unittest.mock as mock
import Bio.PDB
import pytest
import PythonPDBStructures.pdb.biopython_utils as bio
class Testparse_pdb():
    def test_works(self, mocker):
        prot_id = '5aol'
        file_name = 'name.pdb'
        m_parser = mocker.patch.object(Bio.PDB.PDBParser, 'get_structure')
        output = bio.parse_pdb(prot_id, file_name)
        m_parser.assert_called_once_with(prot_id, file_name)
        assert isinstance(output, mock.MagicMock)
class Testparse_mmcif():
    def test_works(self, mocker):
        prot_id = '5aol'
        file_name = 'name.cif'
        m_parser = mocker.patch.object(Bio.PDB.MMCIFParser, 'get_structure')
        output = bio.parse_mmcif(prot_id, file_name)
        m_parser.assert_called_once_with(prot_id, file_name)
        assert isinstance(output, mock.MagicMock)
class Testmmcif2dict():
    def test_works(self, mocker):
        file_name = 'name.cif'
        m_parser = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.Bio.PDB.MMCIF2Dict.MMCIF2Dict')
        output = bio.mmcif2dict(file_name)
        m_parser.assert_called_once_with(file_name)
        assert isinstance(output, mock.MagicMock)
class Testwrite_pdb():
    def test_works_is_instance(self, mocker):
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=True)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr')
        m_struct = mocker.patch.object(Bio.PDB.PDBIO, 'set_structure')
        m_save = mocker.patch.object(Bio.PDB.PDBIO, 'save')
        bio.write_pdb('fake_structure_object', 'file_name.pdb')
        m_isinstance.assert_called_once()
        m_hasattr.assert_not_called()
        m_struct.assert_called_once_with('fake_structure_object')
        m_save.assert_called_once_with('file_name.pdb')

    def test_works_hasattr(self, mocker):
        class FakeAtom(object):
            def __init__(self):
                self.level = 'A'

        fake_atom_1 = FakeAtom()
        fake_atom_2 = FakeAtom()
        fake_structure = (fake_atom_1, fake_atom_2)
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=False)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr', return_value=True)
        m_struct = mocker.patch.object(Bio.PDB.PDBIO, 'set_structure')
        m_save = mocker.patch.object(Bio.PDB.PDBIO, 'save')
        bio.write_pdb(fake_structure, 'file_name.pdb')
        m_isinstance.assert_called_once()
        m_hasattr.assert_called_once()
        m_struct.assert_called_once_with(fake_structure)
        m_save.assert_called_once_with('file_name.pdb')

    def test_raises(self, mocker):
        class FakeAtom(object):
            def __init__(self):
                self.level = 'A'

        fake_atom_1 = FakeAtom()
        fake_atom_2 = FakeAtom()
        fake_structure = (fake_atom_1, fake_atom_2)
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=False)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr', return_value=False)
        with pytest.raises(TypeError):
            bio.write_pdb(fake_structure, 'file_name.pdb')
        m_isinstance.assert_called_once()
        m_hasattr.assert_called_once()
class Testwrite_mmcif():
    def test_works_is_instance(self, mocker):
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=True)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr')
        m_struct = mocker.patch.object(Bio.PDB.MMCIFIO, 'set_structure')
        m_save = mocker.patch.object(Bio.PDB.MMCIFIO, 'save')
        bio.write_mmcif('fake_structure_object', 'file_name.cif')
        m_isinstance.assert_called_once()
        m_hasattr.assert_not_called()
        m_struct.assert_called_once_with('fake_structure_object')
        m_save.assert_called_once_with('file_name.cif')

    def test_works_hasattr(self, mocker):
        class FakeAtom(object):
            def __init__(self):
                self.level = 'A'

        fake_atom_1 = FakeAtom()
        fake_atom_2 = FakeAtom()
        fake_structure = (fake_atom_1, fake_atom_2)
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=False)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr', return_value=True)
        m_struct = mocker.patch.object(Bio.PDB.MMCIFIO, 'set_structure')
        m_save = mocker.patch.object(Bio.PDB.MMCIFIO, 'save')
        bio.write_mmcif(fake_structure, 'file_name.cif')
        m_isinstance.assert_called_once()
        m_hasattr.assert_called_once()
        m_struct.assert_called_once_with(fake_structure)
        m_save.assert_called_once_with('file_name.cif')

    def test_raises(self, mocker):
        class FakeAtom(object):
            def __init__(self):
                self.level = 'A'

        fake_atom_1 = FakeAtom()
        fake_atom_2 = FakeAtom()
        fake_structure = (fake_atom_1, fake_atom_2)
        m_isinstance = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.isinstance', return_value=False)
        m_hasattr = \
            mocker.patch('PythonPDBStructures.pdb.biopython_utils.hasattr', return_value=False)
        with pytest.raises(TypeError):
            bio.write_mmcif(fake_structure, 'file_name.cif')
        m_isinstance.assert_called_once()
        m_hasattr.assert_called_once()
class Testwrite_dict2mmcif():
    def test_works(self, mocker):
        fake_dict = {}
        m_dict = mocker.patch.object(Bio.PDB.MMCIFIO, 'set_dict')
        m_save = mocker.patch.object(Bio.PDB.MMCIFIO, 'save')
        bio.write_dict2mmcif(fake_dict, 'file_name.cif')
        m_dict.assert_called_once_with(fake_dict)
        m_save.assert_called_once_with('file_name.cif')
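The `Testwrite_pdb`/`Testwrite_mmcif` cases above all probe the same dispatch: write directly when the input is already a structure object, fall back to an iterable of entities that expose a `level` attribute, and raise `TypeError` otherwise. A self-contained sketch of that control flow — `write_structure`, `FakeIO`, and `Structure` are hypothetical stand-ins for the Bio.PDB classes the real `PythonPDBStructures` module wraps, which may differ in detail:

```python
class Structure:
    """Stand-in for a full Bio.PDB structure object."""


class FakeIO:
    """Stand-in for Bio.PDB.PDBIO / Bio.PDB.MMCIFIO."""

    def set_structure(self, structure):
        self.structure = structure

    def save(self, file_name):
        self.file_name = file_name


def write_structure(structure, file_name, io_class=FakeIO):
    io = io_class()
    if isinstance(structure, Structure):
        io.set_structure(structure)  # whole structure: write as-is
    elif all(hasattr(entity, 'level') for entity in structure):
        io.set_structure(structure)  # iterable of SMCRA entities
    else:
        raise TypeError('Expected a structure or an iterable of entities')
    io.save(file_name)
    return io
```

The three tests map onto the three branches: `isinstance` true, `hasattr` fallback true, and the `TypeError` path.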
| 27.3361 | 94 | 0.652095 | 792 | 6,588 | 5.083333 | 0.126263 | 0.068306 | 0.091406 | 0.125186 | 0.875062 | 0.842275 | 0.833333 | 0.814704 | 0.780676 | 0.771734 | 0 | 0.006697 | 0.229356 | 6,588 | 240 | 95 | 27.45 | 0.786291 | 0.051154 | 0 | 0.804688 | 0 | 0 | 0.169428 | 0.120551 | 0 | 0 | 0 | 0 | 0.21875 | 1 | 0.109375 | false | 0 | 0.03125 | 0 | 0.21875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
4fa6561a460c33a4dd5702b428ad9befbcfa806e | 3,029 | py | Python | sorting_algos/test_sorts.py | steveflys/data-structures-and-algorithms | 9c89cb24449ca7bc09578408cba3c877fe74e000 | [
"MIT"
] | null | null | null | sorting_algos/test_sorts.py | steveflys/data-structures-and-algorithms | 9c89cb24449ca7bc09578408cba3c877fe74e000 | [
"MIT"
] | 3 | 2018-05-01T18:07:50.000Z | 2018-05-11T16:52:16.000Z | sorting_algos/test_sorts.py | steveflys/data-structures-and-algorithms | 9c89cb24449ca7bc09578408cba3c877fe74e000 | [
"MIT"
] | null | null | null | """Made tersts for the sort functions."""
from .mergesort import merge_sort
from .selection import selection_sort
from .quicksort import quicksort
from .radix_sort import radix_sort
def test_merge_sort_on_integers():
"""Test the merge sort on integers."""
assert merge_sort([54, 26, 93, 17, 77, 31, 44, 55, 20]) == [17, 20, 26, 31, 44, 54, 55, 77, 93]
def test_merge_sort_on_words():
"""Test the merge sort on words."""
assert merge_sort(['cat', 'dog', 'snake', 'gerbil']) == ['cat', 'dog', 'gerbil', 'snake']
def test_merge_sort_with_neg():
"""Test the merge sort on integers with neg integer."""
assert merge_sort([54, 26, 93, 17, 77, 31, 44, -55, 20]) == [-55, 17, 20, 26, 31, 44, 54, 77, 93]
def test_merge_sort_with_floats():
"""Test the merge sort on integers with neg integer."""
assert merge_sort([54, 26.01, 93, 17, 77, 26, 44, -55, 20]) == [-55, 17, 20, 26, 26.01, 44, 54, 77, 93]
def test_selection_sort_on_integers():
"""Test the merge sort on integers."""
assert selection_sort([54, 26, 93, 17, 77, 31, 44, 55, 20]) == [17, 20, 26, 31, 44, 54, 55, 77, 93]
def test_selection_sort_on_words():
"""Test the selection sort on words."""
assert selection_sort(['cat', 'dog', 'snake', 'gerbil']) == ['cat', 'dog', 'gerbil', 'snake']
def test_selection_sort_with_neg():
"""Test the selection sort on integers with neg integer."""
assert selection_sort([54, 26, 93, 17, 77, 31, 44, -55, 20]) == [-55, 17, 20, 26, 31, 44, 54, 77, 93]
def test_selection_sort_with_floats():
"""Test the selection sort on integers with neg integer."""
assert selection_sort([54, 26.01, 93, 17, 77, 26, 44, -55, 20]) == [-55, 17, 20, 26, 26.01, 44, 54, 77, 93]
def test_quicksort_on_integers():
"""Test the merge sort on integers."""
assert quicksort([54, 26, 93, 17, 77, 31, 44, 55, 20]) == [17, 20, 26, 31, 44, 54, 55, 77, 93]
def test_quicksort_on_words():
"""Test the selection sort on words."""
assert quicksort(['cat', 'dog', 'snake', 'gerbil']) == ['cat', 'dog', 'gerbil', 'snake']
def test_quicksort_with_neg():
"""Test the selection sort on integers with neg integer."""
assert quicksort([54, 26, 93, 17, 77, 31, 44, -55, 20]) == [-55, 17, 20, 26, 31, 44, 54, 77, 93]
def test_quicksort_with_floats():
"""Test the selection sort on integers with neg integer."""
assert quicksort([54, 26.01, 93, 17, 77, 26, 44, -55, 20]) == [-55, 17, 20, 26, 26.01, 44, 54, 77, 93]
def test_radix_sort_on_integers():
"""Test the merge sort on integers."""
assert radix_sort([54, 26, 93, 17, 77, 31, 44, 55, 20]) == [17, 20, 26, 31, 44, 54, 55, 77, 93]
def test_radix_sort_on_integers_of_diff_lengths():
"""Test the merge sort on integers of diffeerent lengths."""
assert radix_sort([540, 2, 9113, 17, 707, 31, 4, 55555, 20]) == [2, 4, 17, 20, 31, 540, 707, 9113, 55555]
def test_radix_sort_on_list_of_one():
"""Test the merge sort on integers of diffeerent lengths."""
assert radix_sort([540]) == [540]
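The fixed cases above can be complemented with a randomized check against Python's built-in `sorted`, which catches edge cases the hand-picked lists miss. A sketch of the pattern — the inline `merge_sort` below is a minimal stand-in so the example is self-contained; in the real suite you would import the functions from `.mergesort` etc. as above:

```python
import random


def merge_sort(items):
    """Minimal merge sort stand-in for the function under test."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]


def test_merge_sort_matches_builtin():
    """Compare against sorted() over many random integer lists."""
    for _ in range(200):
        data = [random.randint(-100, 100) for _ in range(random.randint(0, 30))]
        assert merge_sort(data) == sorted(data)
```

The same one-line comparison works for `selection_sort`, `quicksort`, and (restricted to non-negative integers) `radix_sort`.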
| 36.939024 | 111 | 0.632882 | 502 | 3,029 | 3.667331 | 0.09761 | 0.0717 | 0.121673 | 0.05975 | 0.858229 | 0.80717 | 0.792504 | 0.775665 | 0.750679 | 0.68767 | 0 | 0.172498 | 0.188511 | 3,029 | 81 | 112 | 37.395062 | 0.576485 | 0.228128 | 0 | 0 | 0 | 0 | 0.045193 | 0 | 0 | 0 | 0 | 0 | 0.441176 | 1 | 0.441176 | true | 0 | 0.117647 | 0 | 0.558824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 10 |
96c5c432c0f9fa173329f5f04ff85eb12dec40c8 | 4,185 | py | Python | 2021/15.py | frodetb/aoc | 2dad7c0d7342d97f4a3a2ad202afe302c3473368 | [
"MIT"
] | null | null | null | 2021/15.py | frodetb/aoc | 2dad7c0d7342d97f4a3a2ad202afe302c3473368 | [
"MIT"
] | null | null | null | 2021/15.py | frodetb/aoc | 2dad7c0d7342d97f4a3a2ad202afe302c3473368 | [
"MIT"
] | null | null | null | risk_level = [[int(c) for c in line.strip()] for line in open("input_15", "r").readlines()]
R = len(risk_level)
C = len(risk_level[0])
path_length = [[None for _ in range(C)] for __ in range(R)]
path_length[-1][-1] = 0
def neighbours(r, c):
    for dr, dc in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        if 0 <= r + dr < R and 0 <= c + dc < C:
            yield r + dr, c + dc

for r, c in ((r, c) for r in reversed(range(R)) for c in reversed(range(C))):
    for rn, cn in neighbours(r, c):
        if path_length[rn][cn] is not None:
            if path_length[r][c] is None or path_length[r][c] > path_length[rn][cn] + risk_level[rn][cn]:
                path_length[r][c] = path_length[rn][cn] + risk_level[rn][cn]

print(path_length[0][0])

def neighbours(r, c):
    for dr, dc in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
        if 0 <= r + dr < 5 * R and 0 <= c + dc < 5 * C:
            yield r + dr, c + dc

risk_level = [[(risk_level[r % R][c % C] + (r // R) + (c // C) - 1) % 9 + 1 for c in range(5 * C)] for r in range(5 * R)]
path_length = [[None for _ in range(5 * C)] for __ in range(5 * R)]
path_length[-1][-1] = 0

# Backward sweep fills every cell with an initial estimate.
for r, c in ((r, c) for r in reversed(range(5 * R)) for c in reversed(range(5 * C))):
    for rn, cn in neighbours(r, c):
        if path_length[rn][cn] is not None:
            if path_length[r][c] is None or path_length[r][c] > path_length[rn][cn] + risk_level[rn][cn]:
                path_length[r][c] = path_length[rn][cn] + risk_level[rn][cn]

# Keep relaxing until no path length improves, so the result is
# guaranteed to converge regardless of how winding the best path is.
changed = True
while changed:
    changed = False
    for r, c in ((r, c) for r in reversed(range(5 * R)) for c in reversed(range(5 * C))):
        for rn, cn in neighbours(r, c):
            if path_length[r][c] > path_length[rn][cn] + risk_level[rn][cn]:
                path_length[r][c] = path_length[rn][cn] + risk_level[rn][cn]
                changed = True
print(path_length[0][0])
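The backward sweeps above amount to Bellman-Ford-style relaxation over the grid. The same problem is usually solved with Dijkstra's algorithm and a heap, which settles each cell once. A sketch under the same conventions (the entry cell's risk is not counted, and the goal is the bottom-right corner):

```python
import heapq


def lowest_total_risk(grid):
    """Dijkstra's algorithm from the top-left to the bottom-right cell."""
    rows, cols = len(grid), len(grid[0])
    dist = {(0, 0): 0}
    heap = [(0, 0, 0)]  # (total risk so far, row, col)
    while heap:
        d, r, c = heapq.heappop(heap)
        if (r, c) == (rows - 1, cols - 1):
            return d
        if d > dist.get((r, c), float('inf')):
            continue  # stale heap entry, already found a shorter path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return None
```

On a small hand-checkable grid such as `[[1, 1, 9], [9, 1, 1], [9, 9, 1]]` the cheapest path hugs the diagonal of 1s.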
| 53.653846 | 122 | 0.572999 | 813 | 4,185 | 2.833948 | 0.04428 | 0.057292 | 0.135417 | 0.157986 | 0.942708 | 0.928385 | 0.869358 | 0.869358 | 0.869358 | 0.869358 | 0 | 0.01995 | 0.233453 | 4,185 | 77 | 123 | 54.350649 | 0.698254 | 0 | 0 | 0.867647 | 0 | 0 | 0.002151 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.029412 | false | 0 | 0 | 0 | 0.029412 | 0.029412 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8c287b8c76ec09328e29562aa8bf203ea0e6d388 | 48 | py | Python | creational/singleton/metaclass/metaclass.py | Zim95/design_patterns | 39d0aadb51094049e1d23dd34ef3d137f3609106 | [
"MIT"
] | null | null | null | creational/singleton/metaclass/metaclass.py | Zim95/design_patterns | 39d0aadb51094049e1d23dd34ef3d137f3609106 | [
"MIT"
] | null | null | null | creational/singleton/metaclass/metaclass.py | Zim95/design_patterns | 39d0aadb51094049e1d23dd34ef3d137f3609106 | [
"MIT"
] | null | null | null | class Test:
pass
print(Test)
print(Test()) | 8 | 13 | 0.645833 | 7 | 48 | 4.428571 | 0.571429 | 0.580645 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.208333 | 48 | 6 | 13 | 8 | 0.815789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.25 | 0 | 0 | 0.25 | 0.5 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 7 |
8c50beca9141d18fc7857782bc0f0ce0d38422d0 | 10,840 | py | Python | mailchimp_transactional/api/senders_api.py | mailchimp/mailchimp-transactional-python | 13984adc51f8a91a08c8b282d25c6752ba0375c4 | [
"Apache-2.0"
] | 21 | 2020-08-31T16:24:14.000Z | 2022-03-16T17:18:36.000Z | build/lib/mailchimp_transactional/api/senders_api.py | mailchimp/mailchimp-transactional-python | 13984adc51f8a91a08c8b282d25c6752ba0375c4 | [
"Apache-2.0"
] | null | null | null | build/lib/mailchimp_transactional/api/senders_api.py | mailchimp/mailchimp-transactional-python | 13984adc51f8a91a08c8b282d25c6752ba0375c4 | [
"Apache-2.0"
] | 5 | 2021-02-02T10:17:43.000Z | 2022-01-21T15:49:38.000Z | # coding: utf-8
"""
Mailchimp Transactional API
No description provided (generated by Swagger Codegen https://github.com/swagger-api/swagger-codegen) # noqa: E501
OpenAPI spec version: 1.0.46
Contact: apihelp@mailchimp.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from mailchimp_transactional.api_client import ApiClient
class SendersApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_key='', api_client = None):
self.api_key = api_key
if api_client:
self.api_client = api_client
else:
self.api_client = ApiClient()
def add_domain(self, body = {}, **kwargs): # noqa: E501
"""Add sender domain # noqa: E501
Adds a sender domain to your account. Sender domains are added automatically as you send, but you can use this call to add them ahead of time. # noqa: E501
"""
(data) = self.add_domain_with_http_info(body, **kwargs) # noqa: E501
return data
def add_domain_with_http_info(self, body, **kwargs): # noqa: E501
"""Add sender domain # noqa: E501
Adds a sender domain to your account. Sender domains are added automatically as you send, but you can use this call to add them ahead of time. # noqa: E501
"""
all_params = ['body'] # noqa: E501
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method add_domain" % key
)
params[key] = val
del params['kwargs']
# add api_key to body params
params['body']['key'] = self.api_key
body_params = None
if 'body' in params:
body_params = params['body']
return self.api_client.call_api(
'/senders/add-domain', 'POST',
body=body_params,
response_type='InlineResponse20045') # noqa: E501
def check_domain(self, body = {}, **kwargs): # noqa: E501
"""Check domain settings # noqa: E501
Checks the SPF and DKIM settings for a domain, as well the domain verification. If you haven't already added this domain to your account, it will be added automatically. # noqa: E501
"""
(data) = self.check_domain_with_http_info(body, **kwargs) # noqa: E501
return data
def check_domain_with_http_info(self, body, **kwargs): # noqa: E501
"""Check domain settings # noqa: E501
Checks the SPF and DKIM settings for a domain, as well the domain verification. If you haven't already added this domain to your account, it will be added automatically. # noqa: E501
"""
all_params = ['body'] # noqa: E501
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method check_domain" % key
)
params[key] = val
del params['kwargs']
# add api_key to body params
params['body']['key'] = self.api_key
body_params = None
if 'body' in params:
body_params = params['body']
return self.api_client.call_api(
'/senders/check-domain', 'POST',
body=body_params,
response_type='InlineResponse20046') # noqa: E501
def domains(self, body = {}, **kwargs): # noqa: E501
"""List sender domains # noqa: E501
Returns the sender domains that have been added to this account. # noqa: E501
"""
(data) = self.domains_with_http_info(body, **kwargs) # noqa: E501
return data
def domains_with_http_info(self, body, **kwargs): # noqa: E501
"""List sender domains # noqa: E501
Returns the sender domains that have been added to this account. # noqa: E501
"""
all_params = ['body'] # noqa: E501
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method domains" % key
)
params[key] = val
del params['kwargs']
# add api_key to body params
params['body']['key'] = self.api_key
body_params = None
if 'body' in params:
body_params = params['body']
return self.api_client.call_api(
'/senders/domains', 'POST',
body=body_params,
response_type='list[InlineResponse20044]') # noqa: E501
def info(self, body = {}, **kwargs): # noqa: E501
"""Get sender info # noqa: E501
Return more detailed information about a single sender, including aggregates of recent stats. # noqa: E501
"""
(data) = self.info_with_http_info(body, **kwargs) # noqa: E501
return data
def info_with_http_info(self, body, **kwargs): # noqa: E501
"""Get sender info # noqa: E501
Return more detailed information about a single sender, including aggregates of recent stats. # noqa: E501
"""
all_params = ['body'] # noqa: E501
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method info" % key
)
params[key] = val
del params['kwargs']
# add api_key to body params
params['body']['key'] = self.api_key
body_params = None
if 'body' in params:
body_params = params['body']
return self.api_client.call_api(
'/senders/info', 'POST',
body=body_params,
            response_type='InlineResponse20048')  # noqa: E501

    def list(self, body={}, **kwargs):  # noqa: E501
        """List account senders  # noqa: E501

        Return the senders that have tried to use this account.  # noqa: E501
        """
        (data) = self.list_with_http_info(body, **kwargs)  # noqa: E501
        return data

    def list_with_http_info(self, body, **kwargs):  # noqa: E501
        """List account senders  # noqa: E501

        Return the senders that have tried to use this account.  # noqa: E501
        """
        all_params = ['body']  # noqa: E501

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method list" % key
                )
            params[key] = val
        del params['kwargs']

        # add api_key to body params
        params['body']['key'] = self.api_key

        body_params = None
        if 'body' in params:
            body_params = params['body']

        return self.api_client.call_api(
            '/senders/list', 'POST',
            body=body_params,
            response_type='list[InlineResponse20043]')  # noqa: E501

    def time_series(self, body={}, **kwargs):  # noqa: E501
        """View sender history  # noqa: E501

        Return the recent history (hourly stats for the last 30 days) for a sender.  # noqa: E501
        """
        (data) = self.time_series_with_http_info(body, **kwargs)  # noqa: E501
        return data

    def time_series_with_http_info(self, body, **kwargs):  # noqa: E501
        """View sender history  # noqa: E501

        Return the recent history (hourly stats for the last 30 days) for a sender.  # noqa: E501
        """
        all_params = ['body']  # noqa: E501

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method time_series" % key
                )
            params[key] = val
        del params['kwargs']

        # add api_key to body params
        params['body']['key'] = self.api_key

        body_params = None
        if 'body' in params:
            body_params = params['body']

        return self.api_client.call_api(
            '/senders/time-series', 'POST',
            body=body_params,
            response_type='list[InlineResponse20049]')  # noqa: E501

    def verify_domain(self, body={}, **kwargs):  # noqa: E501
        """Verify domain  # noqa: E501

        Sends a verification email in order to verify ownership of a domain. Domain verification is a required step to confirm ownership of a domain. Once a domain has been verified in a Transactional API account, other accounts may not have their messages signed by that domain unless they also verify the domain. This prevents other Transactional API accounts from sending mail signed by your domain.  # noqa: E501
        """
        (data) = self.verify_domain_with_http_info(body, **kwargs)  # noqa: E501
        return data

    def verify_domain_with_http_info(self, body, **kwargs):  # noqa: E501
        """Verify domain  # noqa: E501

        Sends a verification email in order to verify ownership of a domain. Domain verification is a required step to confirm ownership of a domain. Once a domain has been verified in a Transactional API account, other accounts may not have their messages signed by that domain unless they also verify the domain. This prevents other Transactional API accounts from sending mail signed by your domain.  # noqa: E501
        """
        all_params = ['body']  # noqa: E501

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method verify_domain" % key
                )
            params[key] = val
        del params['kwargs']

        # add api_key to body params
        params['body']['key'] = self.api_key

        body_params = None
        if 'body' in params:
            body_params = params['body']

        return self.api_client.call_api(
            '/senders/verify-domain', 'POST',
            body=body_params,
            response_type='InlineResponse20047')  # noqa: E501
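Every `*_with_http_info` method in this generated client repeats the same kwargs-validation and API-key-injection steps. The pattern in isolation looks like the sketch below; `build_body` is a hypothetical name, not part of the generated client, and unlike the generated code it copies the body rather than mutating the caller's (possibly shared mutable-default) dict:

```python
def build_body(api_key, body=None, allowed=('body',), **kwargs):
    """Validate keyword arguments and inject the API key, as the generated methods do."""
    # Reject anything outside the whitelist, mirroring the TypeError above.
    for key in kwargs:
        if key not in allowed:
            raise TypeError("Got an unexpected keyword argument '%s'" % key)
    # Copy instead of mutating the caller's dict, then add the key.
    body = dict(body or {})
    body['key'] = api_key
    return body
```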
# ---------------------------------------------------------------------------
# File: test/test_manager.py | Repo: ORNL/curifactory | License: BSD-3-Clause
# ---------------------------------------------------------------------------
"""Testing pathing functions on the manager."""
import os

from pytest_mock import mocker  # noqa: F401 -- flake8 doesn't see it's used as fixture

from curifactory import stage, Record
from curifactory.caching import Cacheable, Lazy, PickleCacher


class FakeCacher(Cacheable):
    def __init__(self, path_override=None):
        super().__init__(".fake", path_override=path_override)

    def load(self):
        return None

    def save(self, obj):
        pass


# -----------------------------------------
# Stage/manager.get_path integration tests
# -----------------------------------------
def test_stage_integration_basic(
    mocker,  # noqa: F811 -- mocker has to be passed in as fixture
    sample_args,
    configured_test_manager,
):
    """The stage should be calling manager's get_path with the correct parameters."""
    mock = mocker.patch.object(
        configured_test_manager, "get_path", return_value="test_path"
    )

    @stage([], ["test_output"], [FakeCacher])
    def do_thing(record):
        return "hello world"

    record = Record(configured_test_manager, sample_args)
    do_thing(record)
    mock.assert_called_once_with("test_output", record, aggregate_records=None)


def test_stage_integration_path_override(
    mocker,  # noqa: F811 -- mocker has to be passed in as fixture
    sample_args,
    configured_test_manager,
):
    """The stage should be calling manager's get_path with the correct parameters when using a path override in a cacher."""
    mock = mocker.patch.object(
        configured_test_manager, "get_path", return_value="test_path"
    )

    @stage([], ["test_output"], [FakeCacher(path_override="test/examples/WHAT")])
    def do_thing(record):
        return "hello world"

    record = Record(configured_test_manager, sample_args)
    do_thing(record)
    mock.assert_called_once_with(
        "test_output", record, base_path="test/examples/WHAT", aggregate_records=None
    )


def test_stage_integration_storefull(
    mocker,  # noqa: F811 -- mocker has to be passed in as fixture
    sample_args,
    configured_test_manager,
):
    """The stage should be calling manager's get_path twice with the correct parameters."""
    configured_test_manager.store_entire_run = True
    mock = mocker.patch.object(
        configured_test_manager, "get_path", return_value="test_path"
    )

    @stage([], ["test_output"], [FakeCacher])
    def do_thing(record):
        return "hello world"

    record = Record(configured_test_manager, sample_args)
    do_thing(record)
    assert len(mock.call_args_list) == 2
    assert mock.call_args_list[0].args == ("test_output", record)
    assert mock.call_args_list[0].kwargs == dict(aggregate_records=None)
    assert mock.call_args_list[1].args == ("test_output", record)
    assert mock.call_args_list[1].kwargs == dict(output=True, aggregate_records=None)


# -----------------------------------------
# get_path unit tests
# -----------------------------------------


def test_get_path_basic(sample_args, configured_test_manager):
    """Calling get_path with a basic set of args should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    path = configured_test_manager.get_path(
        "test_output", record, aggregate_records=None
    )
    assert path == "test/examples/data/cache/test_sample_hash__test_output"


def test_get_path_basic_w_stagename(sample_args, configured_test_manager):
    """Calling get_path with a basic set of args and a valid stage name should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    configured_test_manager.current_stage_name = "somestage"
    path = configured_test_manager.get_path(
        "test_output", record, aggregate_records=None
    )
    assert path == "test/examples/data/cache/test_sample_hash_somestage_test_output"


def test_get_path_path_override(sample_args, configured_test_manager):
    """Calling get_path with a basic set of args and a path override should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    path = configured_test_manager.get_path(
        "test_output", record, base_path="test/examples/WHAT", aggregate_records=None
    )
    assert path == "test/examples/WHAT/test_sample_hash__test_output"


def test_get_path_store_full(sample_args, configured_test_manager):
    """Calling get_path with store-full should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    configured_test_manager.store_entire_run = True
    ts = configured_test_manager.get_str_timestamp()
    path = configured_test_manager.get_path(
        "test_output", record, output=True, aggregate_records=None
    )
    assert (
        path
        == f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/test_sample_hash__test_output"
    )


def test_get_path_custom_name(sample_args, configured_test_manager):
    """Calling get_path when a custom name is in use should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    configured_test_manager.custom_name = "some_custom_name"
    path = configured_test_manager.get_path(
        "test_output", record, aggregate_records=None
    )
    assert path == "test/examples/data/cache/some_custom_name_sample_hash__test_output"


def test_get_path_custom_name_and_store_full(sample_args, configured_test_manager):
    """Calling get_path when a custom name is in use and storefull is called should return the expected path."""
    record = Record(configured_test_manager, sample_args)
    configured_test_manager.store_entire_run = True
    configured_test_manager.custom_name = "some_custom_name"
    ts = configured_test_manager.get_str_timestamp()
    path = configured_test_manager.get_path(
        "test_output", record, output=True, aggregate_records=None
    )
    assert (
        path
        == f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/some_custom_name_sample_hash__test_output"
    )


def test_get_path_no_args(configured_test_manager):
    """Calling get_path with None args will result in None in the output name."""
    record = Record(configured_test_manager, None)
    path = configured_test_manager.get_path(
        "test_output", record, aggregate_records=None
    )
    assert path == "test/examples/data/cache/test_None__test_output"


# -----------------------------------------
# get_run_output_path unit tests
# -----------------------------------------


def test_get_run_output_path(configured_test_manager):
    ts = configured_test_manager.get_str_timestamp()
    path = configured_test_manager.get_run_output_path()
    assert (
        path
        == f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}"
    )


def test_get_run_output_path_w_obj_name(configured_test_manager):
    ts = configured_test_manager.get_str_timestamp()
    path = configured_test_manager.get_run_output_path("testing_file.txt")
    assert (
        path
        == f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/testing_file.txt"
    )
    assert os.path.exists(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}"
    )
    assert not os.path.isdir(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/testing_file.txt"
    )


def test_get_run_output_path_w_subdirs_obj_name(configured_test_manager):
    ts = configured_test_manager.get_str_timestamp()
    path = configured_test_manager.get_run_output_path("subs/testing_file.txt")
    assert (
        path
        == f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/subs/testing_file.txt"
    )
    assert os.path.exists(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/subs"
    )


# -----------------------------------------
# misc
# -----------------------------------------


def test_write_run_env_output(configured_test_manager):
    """Storing a manager run should output the four expected environment info textfiles into the store-full folder."""
    configured_test_manager.store()
    configured_test_manager.write_run_env_output()
    ts = configured_test_manager.get_str_timestamp()

    # NOTE: the run number is 1 because when we store it we actually populate that information (there's just a default value of 0)
    assert os.path.exists(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/requirements.txt"
    )
    assert os.path.exists(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/environment_meta.txt"
    )
    assert os.path.exists(
        f"test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts}/run_info.json"
    )


def test_run_line_sanitization_normal(configured_test_manager):
    """Ensure that a normal store-full run-line will result in a correct reproduction line in the run info."""
    configured_test_manager.run_line = "experiment test -p params1 --store-full"
    configured_test_manager.store_entire_run = True
    configured_test_manager.store()
    ts = configured_test_manager.get_str_timestamp()
    assert (
        configured_test_manager.run_info["reproduce"]
        == f"experiment test -p params1 --cache test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts} --dry-cache"
    )


def test_run_line_sanitization_order(configured_test_manager):
    """The placement of the --store-full flag in the initial run line should not impact the reproduction line."""
    configured_test_manager.run_line = (
        "experiment test -p params1 --store-full --parallel 4"
    )
    configured_test_manager.store_entire_run = True
    configured_test_manager.store()
    ts = configured_test_manager.get_str_timestamp()
    assert (
        configured_test_manager.run_info["reproduce"]
        == f"experiment test -p params1 --parallel 4 --cache test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts} --dry-cache"
    )


def test_run_line_sanitization_overwrite(configured_test_manager):
    """The reproduction line should not include overwrite."""
    configured_test_manager.run_line = (
        "experiment test -p params1 --store-full --parallel 4 --overwrite"
    )
    configured_test_manager.store_entire_run = True
    configured_test_manager.store()
    ts = configured_test_manager.get_str_timestamp()
    assert (
        configured_test_manager.run_info["reproduce"]
        == f"experiment test -p params1 --parallel 4 --cache test/examples/data/runs/test_{configured_test_manager.experiment_run_number}_{ts} --dry-cache"
    )


def test_cache_aware_dict_resolve(configured_test_manager):
    """Ensure when resolve is on, lazy objects in a record's state auto load the thing."""

    @stage([], [Lazy("tester")], cachers=[PickleCacher])
    def output_stage(record):
        return "hello world"

    record = Record(configured_test_manager, None)
    output_stage(record)
    assert record.state["tester"] == "hello world"


def test_cache_aware_dict_no_resolve(configured_test_manager):
    """Ensure when resolve is off, lazy objects in a record's state remain lazy."""

    @stage([], [Lazy("tester")], cachers=[PickleCacher])
    def output_stage(record):
        return "hello world"

    record = Record(configured_test_manager, None)
    output_stage(record)
    record.state.resolve = False
    assert type(record.state["tester"]) == Lazy
    assert record.state["tester"].name == "tester"
    assert type(record.state["tester"].cacher) == PickleCacher
# ---------------------------------------------------------------------------
# File: mtdnn/common/archive_maps.py | Repo: microsoft/mt-dnn | License: MIT
# ---------------------------------------------------------------------------
# coding=utf-8
# Copyright (c) Microsoft. All rights reserved.

PRETRAINED_MODEL_ARCHIVE_MAP = {
    "mtdnn-base-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_base.pt",
    "mtdnn-large-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_large.pt",
    "mtdnn-kd-large-cased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_kd_large_cased.pt",
}

# TODO - Create these files and upload to blob next to model
PRETRAINED_CONFIG_ARCHIVE_MAP = {
    "mtdnn-base-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_base.json",
    "mtdnn-large-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_large.json",
    "mtdnn-kd-large-cased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_kd_large_cased.json",
}
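Archive maps like these are typically consulted with a plain dict lookup that falls back to treating the argument as a local path. A minimal sketch, where the resolver name is hypothetical and only the map keys and URLs come from the file above:

```python
PRETRAINED_MODEL_ARCHIVE_MAP = {
    "mtdnn-base-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_base.pt",
    "mtdnn-large-uncased": "https://mrc.blob.core.windows.net/mt-dnn-model/mt_dnn_large.pt",
}


def resolve_model_source(name_or_path):
    """Map a shortcut name to its download URL; otherwise assume a local path."""
    return PRETRAINED_MODEL_ARCHIVE_MAP.get(name_or_path, name_or_path)
```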
# ---------------------------------------------------------------------------
# File: faeAuditor/auditGroup2Results/migrations/0009_auto_20181108_1330.py
# Repo: opena11y/fae-auditor | License: Apache-2.0
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
# Generated by Django 1.11.8 on 2018-11-08 19:30
from __future__ import unicode_literals

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('auditGroup2Results', '0008_auto_20181030_2223'),
    ]

    operations = [
        migrations.AddField(
            model_name='auditgroup2guidelineresult',
            name='implementation_score_v',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2guidelineresult',
            name='implementation_score_w',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2result',
            name='implementation_score_v',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2result',
            name='implementation_score_w',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2rulecategoryresult',
            name='implementation_score_v',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2rulecategoryresult',
            name='implementation_score_w',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2ruleresult',
            name='implementation_score_v',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2ruleresult',
            name='implementation_score_w',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2rulescoperesult',
            name='implementation_score_v',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
        migrations.AddField(
            model_name='auditgroup2rulescoperesult',
            name='implementation_score_w',
            field=models.DecimalField(decimal_places=1, default=-1, max_digits=4),
        ),
    ]
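The ten AddField operations in this migration are the cross product of five result models and the `_v`/`_w` column suffixes. That enumeration can be sketched without Django; the helper name below is hypothetical:

```python
def implementation_score_columns(model_names, suffixes=("v", "w")):
    """List the (model, column) pairs the migration above adds by hand."""
    return [
        (model, f"implementation_score_{suffix}")
        for model in model_names
        for suffix in suffixes
    ]
```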
# ---------------------------------------------------------------------------
# File: server/tests/steps/sql_translator/test_totals.py
# Repo: ToucanToco/weaverbird | License: BSD-3-Clause
# ---------------------------------------------------------------------------
import pytest
from weaverbird.backends.sql_translator.metadata import ColumnMetadata, SqlQueryMetadataManager
from weaverbird.backends.sql_translator.steps.totals import translate_totals
from weaverbird.backends.sql_translator.types import SQLQuery
from weaverbird.pipeline.steps import TotalsStep


@pytest.fixture()
def query():
    return SQLQuery(
        query_name='SELECT_STEP_0',
        transformed_query='WITH SELECT_STEP_0 AS (SELECT * FROM products)',
        selection_query='SELECT * FROM SELECT_STEP_0',
        metadata_manager=SqlQueryMetadataManager(
            tables_metadata={
                'table1': {
                    'VALUE_1': 'int',
                    'VALUE_2': 'int',
                    'YEAR': 'int',
                    'COUNTRY': 'text',
                    'PRODUCT': 'text',
                },
            },
        ),
    )


def test_translate_totals(query):
    step = TotalsStep(
        **{
            'name': 'totals',
            'totalDimensions': [
                {'totalColumn': 'COUNTRY', 'totalRowsLabel': 'All countries'},
                {'totalColumn': 'PRODUCT', 'totalRowsLabel': 'All products'},
            ],
            'aggregations': [
                {
                    'columns': ['VALUE_1', 'VALUE_2'],
                    'aggfunction': 'sum',
                    'newcolumns': ['VALUE_1-SUM', 'VALUE_2-SUM'],
                },
                {
                    'columns': ['VALUE_1'],
                    'aggfunction': 'avg',
                    'newcolumns': ['VALUE_1-AVG'],
                },
            ],
            'groups': ['YEAR'],
        }
    )
    new_query = translate_totals(
        step,
        query,
        index=1,
    )
    expected_query = (
        'WITH SELECT_STEP_0 AS (SELECT * FROM products), TOTALS_STEP_1 AS ('
        'SELECT CASE WHEN GROUPING(COUNTRY) = 0 THEN COUNTRY ELSE \'All countries\' END AS "COUNTRY", '
        'CASE WHEN GROUPING(PRODUCT) = 0 THEN PRODUCT ELSE \'All products\' END AS "PRODUCT", '
        'SUM(VALUE_1) AS "VALUE_1-SUM", SUM(VALUE_2) AS "VALUE_2-SUM", AVG(VALUE_1) '
        'AS "VALUE_1-AVG", YEAR FROM SELECT_STEP_0 GROUP BY YEAR, '
        'GROUPING SETS((COUNTRY), (PRODUCT), (COUNTRY, PRODUCT), ()))'
    )
    assert new_query.transformed_query == expected_query
    assert new_query.query_name == 'TOTALS_STEP_1'
    assert (
        new_query.selection_query
        == 'SELECT COUNTRY, PRODUCT, VALUE_1-SUM, VALUE_2-SUM, VALUE_1-AVG, YEAR FROM TOTALS_STEP_1'
    )
    assert new_query.metadata_manager.retrieve_query_metadata_columns() == {
        'VALUE_1-SUM': ColumnMetadata(
            name='VALUE_1-SUM',
            original_name='VALUE_1-SUM',
            type='FLOAT',
            original_type='INT',
            alias=None,
            delete=False,
        ),
        'VALUE_2-SUM': ColumnMetadata(
            name='VALUE_2-SUM',
            original_name='VALUE_2-SUM',
            type='FLOAT',
            original_type='FLOAT',
            alias=None,
            delete=False,
        ),
        'VALUE_1-AVG': ColumnMetadata(
            name='VALUE_1-AVG',
            original_name='VALUE_1-AVG',
            type='FLOAT',
            original_type='FLOAT',
            alias=None,
            delete=False,
        ),
        'COUNTRY': ColumnMetadata(
            name='COUNTRY',
            original_name='COUNTRY',
            type='TEXT',
            original_type='text',
            alias=None,
            delete=False,
        ),
        'PRODUCT': ColumnMetadata(
            name='PRODUCT',
            original_name='PRODUCT',
            type='TEXT',
            original_type='text',
            alias=None,
            delete=False,
        ),
        'YEAR': ColumnMetadata(
            name='YEAR',
            original_name='YEAR',
            type='INT',
            original_type='INT',
            alias=None,
            delete=False,
        ),
    }


def test_translate_totals_no_group(query):
    step = TotalsStep(
        **{
            'name': 'totals',
            'totalDimensions': [
                {'totalColumn': 'COUNTRY', 'totalRowsLabel': 'All countries'},
                {'totalColumn': 'PRODUCT', 'totalRowsLabel': 'All products'},
            ],
            'aggregations': [
                {
                    'columns': ['VALUE_1', 'VALUE_2'],
                    'aggfunction': 'sum',
                    'newcolumns': ['VALUE_1-SUM', 'VALUE_2-SUM'],
                },
                {
                    'columns': ['VALUE_1'],
                    'aggfunction': 'avg',
                    'newcolumns': ['VALUE_1-AVG'],
                },
            ],
            'groups': [],
        }
    )
    new_query = translate_totals(
        step,
        query,
        index=1,
    )
    expected_query = (
        'WITH SELECT_STEP_0 AS (SELECT * FROM products), TOTALS_STEP_1 AS ('
        'SELECT CASE WHEN GROUPING(COUNTRY) = 0 THEN COUNTRY ELSE \'All countries\' END AS "COUNTRY", '
        'CASE WHEN GROUPING(PRODUCT) = 0 THEN PRODUCT ELSE \'All products\' END AS "PRODUCT", '
        'SUM(VALUE_1) AS "VALUE_1-SUM", SUM(VALUE_2) AS "VALUE_2-SUM", AVG(VALUE_1) '
        'AS "VALUE_1-AVG" FROM SELECT_STEP_0 GROUP BY '
        'GROUPING SETS((COUNTRY), (PRODUCT), (COUNTRY, PRODUCT), ()))'
    )
    assert new_query.transformed_query == expected_query
    assert new_query.query_name == 'TOTALS_STEP_1'
    assert (
        new_query.selection_query
        == 'SELECT COUNTRY, PRODUCT, VALUE_1-SUM, VALUE_2-SUM, VALUE_1-AVG FROM TOTALS_STEP_1'
    )
    assert new_query.metadata_manager.retrieve_query_metadata_columns() == {
        'VALUE_1-SUM': ColumnMetadata(
            name='VALUE_1-SUM',
            original_name='VALUE_1-SUM',
            type='FLOAT',
            original_type='INT',
            alias=None,
            delete=False,
        ),
        'VALUE_2-SUM': ColumnMetadata(
            name='VALUE_2-SUM',
            original_name='VALUE_2-SUM',
            type='FLOAT',
            original_type='FLOAT',
            alias=None,
            delete=False,
        ),
        'VALUE_1-AVG': ColumnMetadata(
            name='VALUE_1-AVG',
            original_name='VALUE_1-AVG',
            type='FLOAT',
            original_type='FLOAT',
            alias=None,
            delete=False,
        ),
        'COUNTRY': ColumnMetadata(
            name='COUNTRY',
            original_name='COUNTRY',
            type='TEXT',
            original_type='text',
            alias=None,
            delete=False,
        ),
        'PRODUCT': ColumnMetadata(
            name='PRODUCT',
            original_name='PRODUCT',
            type='TEXT',
            original_type='text',
            alias=None,
            delete=False,
        ),
    }
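Both expected queries above end in the same `GROUPING SETS((COUNTRY), (PRODUCT), (COUNTRY, PRODUCT), ())` clause: every non-empty subset of the total dimensions plus the empty set for the grand-total row. The clause can be generated like the sketch below; this is an illustration of the SQL shape the tests assert, not the translator's actual implementation:

```python
from itertools import combinations


def grouping_sets_clause(total_dimensions):
    """Build the GROUPING SETS(...) clause asserted in the expected queries above."""
    subsets = []
    for size in range(1, len(total_dimensions) + 1):
        for combo in combinations(total_dimensions, size):
            subsets.append("(" + ", ".join(combo) + ")")
    subsets.append("()")  # the grand-total row
    return "GROUPING SETS(" + ", ".join(subsets) + ")"
```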
# ---------------------------------------------------------------------------
# File: dearpypixl/components/handlers.py | Repo: Atlamillias/dearpypixl
# License: MIT
# ---------------------------------------------------------------------------
import functools
from inspect import signature, _KEYWORD_ONLY, _VAR_POSITIONAL
from types import MethodType
from typing import Callable, Any, Union

from dearpygui import dearpygui

from dearpypixl.components.item import Item
from dearpypixl.components.configuration import ItemAttribute, item_attribute, CONFIGURATION


__all__ = [
    "KeyDownHandler",
    "KeyPressHandler",
    "KeyReleaseHandler",
    "MouseMoveHandler",
    "MouseWheelHandler",
    "MouseClickHandler",
    "MouseDoubleClickHandler",
    "MouseDownHandler",
    "MouseReleaseHandler",
    "MouseDragHandler",
    "HoverHandler",
    "ResizeHandler",
    "FocusHandler",
    "ActiveHandler",
    "VisibleHandler",
    "ActivatedHandler",
    "DeactivatedHandler",
    "EditedHandler",
    "DeactivatedAfterEditHandler",
    "ToggledOpenHandler",
    "ClickedHandler",
]


class HandlerItem(Item):
    """Generic class for handler items.
    """

    def __init__(self, callback: Callable | None, **kwargs):
        super().__init__(**kwargs)
        self.callback = callback

    @property
    @item_attribute(category=CONFIGURATION)
    def callback(self) -> Callable | None:
        return dearpygui.get_item_configuration(self._tag)["callback"]

    @callback.setter
    def callback(self, _callback: Callable | None) -> None:
        # Wrapping is done for a couple of reasons. Firstly, to ensure that
        # the Item instance of `sender` is returned instead of the identifier.
        # And second; nuitka compilation doesn't like DPG callbacks unless
        # they are wrapped (lambda, etc.)...for some reason.
        @functools.wraps(_callback)
        def call_object(_, app_data, user_data):
            args = (self, app_data, user_data)[0:pos_arg_cnt]
            _callback(*args)

        # Emulating how DearPyGui doesn't require callbacks having 3 positional
        # arguments. Only pass sender/app_data/user_data if there's "room" to
        # do so.
        pos_arg_cnt = 0
        for param in signature(_callback).parameters.values():
            if param.kind == _VAR_POSITIONAL:
                pos_arg_cnt = 3
            elif param.kind != _KEYWORD_ONLY:
                pos_arg_cnt += 1

            if pos_arg_cnt >= 3:
                pos_arg_cnt = 3
                break

        wrapper = None
        if callable(_callback):
            wrapper = call_object
        elif _callback is None:
            wrapper = None
        else:
            raise ValueError(f"`callback` is not callable (got {type(_callback)!r}.")

        dearpygui.configure_item(self._tag, callback=wrapper)
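The arity-counting loop in the setter above can be exercised on its own. The sketch below restates that logic as a standalone function (the name `positional_arg_count` is hypothetical; it uses the public `inspect.Parameter` constants instead of the private `_KEYWORD_ONLY`/`_VAR_POSITIONAL` ones):

```python
from inspect import signature, Parameter


def positional_arg_count(func, maximum=3):
    """Count how many of sender/app_data/user_data a callback can accept,
    mirroring the clamping logic in HandlerItem.callback above."""
    count = 0
    for param in signature(func).parameters.values():
        if param.kind == Parameter.VAR_POSITIONAL:
            return maximum  # *args soaks up everything
        if param.kind != Parameter.KEYWORD_ONLY:
            count += 1
            if count >= maximum:
                return maximum
    return count
```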
class KeyDownHandler(HandlerItem):
"""Adds a key down handler.
Args:
key (int, optional): Submits callback for all keys
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item.If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
key : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvKeyDownHandler',)
__command__ : Callable = dearpygui.add_key_down_handler
def __init__(
self ,
key : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
key=key,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class KeyPressHandler(HandlerItem):
"""Adds a key press handler.
Args:
key (int, optional): Submits callback for all keys
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item.If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
key : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvKeyPressHandler',)
__command__ : Callable = dearpygui.add_key_press_handler
def __init__(
self ,
key : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
key=key,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class KeyReleaseHandler(HandlerItem):
"""Adds a key release handler.
Args:
key (int, optional): Submits callback for all keys
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item.If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
key : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvKeyReleaseHandler',)
__command__ : Callable = dearpygui.add_key_release_handler
def __init__(
self ,
key : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
key=key,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
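The `ItemAttribute("configuration", "get_item_config", "set_item_config", None)` class attributes above proxy attribute access through the item's configuration getters and setters. As a rough, self-contained sketch of that descriptor pattern — with a plain dict standing in for the DearPyGui backend, and `ConfigAttribute`/`FakeHandler` being invented names, not the real API:

```python
class ConfigAttribute:
    """Descriptor that reads and writes one key of an item's config dict."""

    def __set_name__(self, owner, name):
        self._key = name  # the attribute name doubles as the config key

    def __get__(self, instance, owner=None):
        if instance is None:
            return self
        return instance._config[self._key]

    def __set__(self, instance, value):
        instance._config[self._key] = value


class FakeHandler:
    key = ConfigAttribute()
    show = ConfigAttribute()

    def __init__(self, key=-1, show=True):
        self._config = {"key": key, "show": show}


handler = FakeHandler(key=65)
handler.show = False
print(handler.key, handler.show)  # -> 65 False
```

In the real classes the descriptor would call into `get_item_config`/`set_item_config` instead of touching a local dict, so reads always reflect the live item state.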
class MouseMoveHandler(HandlerItem):
"""Adds a mouse move handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseMoveHandler',)
__command__ : Callable = dearpygui.add_mouse_move_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseWheelHandler(HandlerItem):
"""Adds a mouse wheel handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseWheelHandler',)
__command__ : Callable = dearpygui.add_mouse_wheel_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseClickHandler(HandlerItem):
"""Adds a mouse click handler.
Args:
button (int, optional): Submits callback for all mouse buttons
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseClickHandler',)
__command__ : Callable = dearpygui.add_mouse_click_handler
def __init__(
self ,
button : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
button=button,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseDoubleClickHandler(HandlerItem):
"""Adds a mouse double click handler.
Args:
button (int, optional): Submits callback for all mouse buttons
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseDoubleClickHandler',)
__command__ : Callable = dearpygui.add_mouse_double_click_handler
def __init__(
self ,
button : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
button=button,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseDownHandler(HandlerItem):
"""Adds a mouse down handler.
Args:
button (int, optional): Submits callback for all mouse buttons
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseDownHandler',)
__command__ : Callable = dearpygui.add_mouse_down_handler
def __init__(
self ,
button : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
button=button,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseReleaseHandler(HandlerItem):
"""Adds a mouse release handler.
Args:
button (int, optional): Submits callback for all mouse buttons
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseReleaseHandler',)
__command__ : Callable = dearpygui.add_mouse_release_handler
def __init__(
self ,
button : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
button=button,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
class MouseDragHandler(HandlerItem):
"""Adds a mouse drag handler.
Args:
button (int, optional): Submits callback for all mouse buttons
threshold (float, optional): The distance the mouse must be dragged before the callback is run
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
threshold : float = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('TemplateRegistry', 'Stage', 'HandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvMouseDragHandler',)
__command__ : Callable = dearpygui.add_mouse_drag_handler
def __init__(
self ,
button : int = -1 ,
threshold : float = 10.0,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
callback : Callable = None,
show : bool = True,
parent : Union[int, str] = 11 ,
**kwargs ,
) -> None:
super().__init__(
button=button,
threshold=threshold,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
callback=callback,
show=show,
parent=parent,
**kwargs,
)
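The `threshold` parameter of `MouseDragHandler` is the distance the mouse must travel from the drag's origin before the callback fires. A minimal sketch of how such a check could be computed — the helper name and the Euclidean-distance choice are assumptions for illustration, not DearPyGui internals:

```python
import math


def drag_exceeds_threshold(start, current, threshold=10.0):
    """Return True once the pointer has moved farther than `threshold`
    pixels from where the drag began (Euclidean distance)."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    return math.hypot(dx, dy) > threshold


print(drag_exceeds_threshold((0, 0), (3, 4)))          # -> False (moved 5.0 px)
print(drag_exceeds_threshold((0, 0), (30, 40), 10.0))  # -> True (moved 50.0 px)
```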
class HoverHandler(HandlerItem):
"""Adds a hover handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvHoverHandler',)
__command__ : Callable = dearpygui.add_item_hover_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class ResizeHandler(HandlerItem):
"""Adds a resize handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvResizeHandler',)
__command__ : Callable = dearpygui.add_item_resize_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class FocusHandler(HandlerItem):
"""Adds a focus handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvFocusHandler',)
__command__ : Callable = dearpygui.add_item_focus_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class ActiveHandler(HandlerItem):
"""Adds a active handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvActiveHandler',)
__command__ : Callable = dearpygui.add_item_active_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class VisibleHandler(HandlerItem):
"""Adds a visible handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvVisibleHandler',)
__command__ : Callable = dearpygui.add_item_visible_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class ActivatedHandler(HandlerItem):
"""Adds a activated handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvActivatedHandler',)
__command__ : Callable = dearpygui.add_item_activated_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class DeactivatedHandler(HandlerItem):
"""Adds a deactivated handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvDeactivatedHandler',)
__command__ : Callable = dearpygui.add_item_deactivated_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class EditedHandler(HandlerItem):
"""Adds an edited handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvEditedHandler',)
__command__ : Callable = dearpygui.add_item_edited_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class DeactivatedAfterEditHandler(HandlerItem):
"""Adds a deactivated after edit handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvDeactivatedAfterEditHandler',)
__command__ : Callable = dearpygui.add_item_deactivated_after_edit_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class ToggledOpenHandler(HandlerItem):
"""Adds a togged open handler.
Args:
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvToggledOpenHandler',)
__command__ : Callable = dearpygui.add_item_toggled_open_handler
def __init__(
self ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
class ClickedHandler(HandlerItem):
"""Adds a clicked handler.
Args:
button (int, optional): Submits callback for all mouse buttons
label (str, optional): Overrides 'name' as label.
user_data (Any, optional): User data for callbacks
use_internal_label (bool, optional): Use generated internal label instead of user specified (appends ### uuid).
tag (Union[int, str], optional): Unique id used to programmatically refer to the item. If label is unused this will be the label.
parent (Union[int, str], optional): Parent to add this item to. (runtime adding)
callback (Callable, optional): Registers a callback.
show (bool, optional): Attempt to render widget.
id (Union[int, str], optional): (deprecated)
Returns:
Union[int, str]
"""
button : int = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
show : bool = ItemAttribute("configuration", "get_item_config", "set_item_config", None)
__is_container__ : bool = False
__is_root_item__ : bool = False
__is_value_able__: bool = False
__able_parents__ : tuple = ('Stage', 'TemplateRegistry', 'ItemHandlerRegistry')
__able_children__: tuple = ()
__commands__ : tuple = ()
__constants__ : tuple = ('mvClickedHandler',)
__command__ : Callable = dearpygui.add_item_clicked_handler
def __init__(
self ,
button : int = -1 ,
label : str = None,
user_data : Any = None,
use_internal_label: bool = True,
parent : Union[int, str] = 0 ,
callback : Callable = None,
show : bool = True,
**kwargs ,
) -> None:
super().__init__(
button=button,
label=label,
user_data=user_data,
use_internal_label=use_internal_label,
parent=parent,
callback=callback,
show=show,
**kwargs,
)
# server/tests/test_reactions.py (metamdb/metamdb)

def test_single_id(client):
response = client.get('/api/reactions/1')
json_data = response.get_json()
json_data_result = {
'balanced':
False,
'compounds': [{
'compound': {
'id': 1,
'name': 'fumarate',
'inchi':
'InChI=1S/C4H4O4/c5-3(6)1-2-4(7)8/h1-2H,(H,5,6)(H,7,8)/b2-1+',
'formula': 'C4H4O4'
},
'position': 1,
'quantity': 1,
'reactant': 'substrate'
}],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/1',
'metamdb': 'https://metamdb.tu-bs.de/reaction/1'
},
'formula':
'succinate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/1',
'id':
1,
'identifiers': [{
'databaseIdentifier': 'SUC-FUM-OX-RXN',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
False,
'updatedBy':
None,
'updatedOn':
None
}
assert response.status_code == 200
assert json_data == json_data_result
def test_single_id2(client):
response = client.get('/api/reactions/2')
json_data = response.get_json()
json_data_result = {
'balanced':
False,
'compounds': [],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/2',
'metamdb': 'https://metamdb.tu-bs.de/reaction/2'
},
'formula':
'malate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/2',
'id':
2,
'identifiers': [{
'databaseIdentifier': 'RXN-0543',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
True,
'updatedBy': {
'id': 1,
'name': 'Test Name',
'orcid': '0000-0000-1234-5678',
'role': {
'id': 1,
'name': 'Reviewer'
}
},
'updatedOn':
None
}
assert response.status_code == 200
assert json_data == json_data_result
def test_single_wrong_id(client):
response = client.get('/api/reactions/3')
json_data = response.get_json()
json_data_result = {'message': 'Invalid id'}
assert response.status_code == 404
assert json_data == json_data_result
def test_single_bad_id(client):
response = client.get('/api/reactions/thisisanid')
json_data = response.get_json()
json_data_result = {'message': 'Invalid id'}
assert response.status_code == 404
assert json_data == json_data_result
def test_single_bad_id2(client):
response = client.get('/api/reactions/$$$$')
json_data = response.get_json()
json_data_result = {'message': 'Invalid id'}
assert response.status_code == 404
assert json_data == json_data_result
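The three tests above pin down the single-reaction endpoint's contract: any id that is not a stored reaction — including non-numeric input — yields a 404 with `{'message': 'Invalid id'}`. A hedged, self-contained sketch of that validation logic (all names here are invented, not the actual server code):

```python
KNOWN_IDS = {1, 2}  # stand-in for the reactions table


def get_reaction_status(raw_id: str):
    """Return an (http_status, payload) pair mimicking /api/reactions/<id>."""
    try:
        reaction_id = int(raw_id)
    except ValueError:  # e.g. 'thisisanid' or '$$$$'
        return 404, {"message": "Invalid id"}
    if reaction_id not in KNOWN_IDS:
        return 404, {"message": "Invalid id"}
    return 200, {"id": reaction_id}


print(get_reaction_status("thisisanid"))  # -> (404, {'message': 'Invalid id'})
print(get_reaction_status("1"))           # -> (200, {'id': 1})
```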
def test_multiple_ids(client):
response = client.get('/api/reactions?ids=1,2')
json_data = response.get_json()
json_data_result = {
'reactions': [{
'balanced':
False,
'compounds': [{
'compound': {
'id': 1,
'name': 'fumarate',
'inchi':
'InChI=1S/C4H4O4/c5-3(6)1-2-4(7)8/h1-2H,(H,5,6)(H,7,8)/b2-1+',
'formula': 'C4H4O4'
},
'position': 1,
'quantity': 1,
'reactant': 'substrate'
}],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/1',
'metamdb': 'https://metamdb.tu-bs.de/reaction/1'
},
'formula':
'succinate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/1',
'id':
1,
'identifiers': [{
'databaseIdentifier': 'SUC-FUM-OX-RXN',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
False,
'updatedBy':
None,
'updatedOn':
None
}, {
'balanced':
False,
'compounds': [],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/2',
'metamdb': 'https://metamdb.tu-bs.de/reaction/2'
},
'formula':
'malate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/2',
'id':
2,
'identifiers': [{
'databaseIdentifier': 'RXN-0543',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
True,
'updatedBy': {
'id': 1,
'name': 'Test Name',
'orcid': '0000-0000-1234-5678',
'role': {
'id': 1,
'name': 'Reviewer'
}
},
'updatedOn':
None
}]
}
assert response.status_code == 200
assert json_data == json_data_result
def test_multiple_ids_one_wrong(client):
response = client.get('/api/reactions?ids=1,3')
json_data = response.get_json()
json_data_result = {
'reactions': [{
'balanced':
False,
'compounds': [{
'compound': {
'id': 1,
'name': 'fumarate',
'inchi':
'InChI=1S/C4H4O4/c5-3(6)1-2-4(7)8/h1-2H,(H,5,6)(H,7,8)/b2-1+',
'formula': 'C4H4O4'
},
'position': 1,
'quantity': 1,
'reactant': 'substrate'
}],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/1',
'metamdb': 'https://metamdb.tu-bs.de/reaction/1'
},
'formula':
'succinate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/1',
'id':
1,
'identifiers': [{
'databaseIdentifier': 'SUC-FUM-OX-RXN',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
False,
'updatedBy':
None,
'updatedOn':
None
}, None]
}
assert response.status_code == 200
assert json_data == json_data_result
def test_multiple_ids_all_wrong(client):
response = client.get('/api/reactions?ids=4,3')
json_data = response.get_json()
json_data_result = {'reactions': [None, None]}
assert response.status_code == 200
assert json_data == json_data_result
def test_multiple_ids_one_bad(client):
response = client.get('/api/reactions?ids=1,abc')
json_data = response.get_json()
json_data_result = {
'reactions': [{
'balanced':
False,
'compounds': [{
'compound': {
'id': 1,
'name': 'fumarate',
'inchi':
'InChI=1S/C4H4O4/c5-3(6)1-2-4(7)8/h1-2H,(H,5,6)(H,7,8)/b2-1+',
'formula': 'C4H4O4'
},
'position': 1,
'quantity': 1,
'reactant': 'substrate'
}],
'externalUrls': {
'img': 'https://metamdb.tu-bs.de/img/aam/1',
'metamdb': 'https://metamdb.tu-bs.de/reaction/1'
},
'formula':
'succinate <=> fumarate',
'href':
'https://metamdb.tu-bs.de/api/reactions/1',
'id':
1,
'identifiers': [{
'databaseIdentifier': 'SUC-FUM-OX-RXN',
'source': {
'id': 1,
'name': 'metacyc'
}
}],
'rxnFile':
'LARGE RXN FILE',
'type':
'reaction',
'updated':
False,
'updatedBy':
None,
'updatedOn':
None
}, None]
}
assert response.status_code == 200
assert json_data == json_data_result
def test_multiple_ids_all_bad(client):
response = client.get('/api/reactions?ids=$$$,abc')
json_data = response.get_json()
json_data_result = {'reactions': [None, None]}
assert response.status_code == 200
assert json_data == json_data_result
def test_multiple_ids_no_ids(client):
response = client.get('/api/reactions?ids=')
json_data = response.get_json()
json_data_result = {'message': 'Ids are required but none were given'}
assert response.status_code == 400
assert json_data == json_data_result
def test_multiple_ids_no_ids2(client):
response = client.get('/api/reactions?abc=abc')
json_data = response.get_json()
json_data_result = {
'message':
'The following query parameter/s are required but were not given: [ids]'
}
assert response.status_code == 400
assert json_data == json_data_result
def test_multiple_ids_no_ids3(client):
response = client.get('/api/reactions')
json_data = response.get_json()
json_data_result = {
'message':
'The following query parameter/s are required but were not given: [ids]'
}
assert response.status_code == 400
assert json_data == json_data_result
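The multiple-id cases above pin down a simple mapping: each comma-separated token yields the matching reaction when the id is a known integer, and `None` otherwise. A minimal stdlib sketch of that behaviour (`parse_ids` and the `known` set are hypothetical names mirroring the fixture data, not the application's code):

```python
def parse_ids(raw, known=frozenset({1, 2})):
    # Each comma-separated token maps to its integer id when the id is
    # known, and to None for unknown or non-numeric tokens.
    out = []
    for token in raw.split(','):
        try:
            rid = int(token)
        except ValueError:
            out.append(None)
            continue
        out.append(rid if rid in known else None)
    return out

assert parse_ids('1,3') == [1, None]
assert parse_ids('$$$,abc') == [None, None]
assert parse_ids('1,2') == [1, 2]
```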
# TestModule/UserPlayerTest.py (INYEONGKIM/Quattro, MIT)
import unittest
from QuattroComponents.Player import Player
from QuattroComponents.Card import Card
from TestModule.GetMethodName import get_method_name_decorator
class UserPlayerTest(unittest.TestCase):
player = Player(user_name="player", user_deck=[])
method_names = set()
@get_method_name_decorator
def test_is_quattro(self):
self.player.user_deck = [
Card(number=1, color='green', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=5, color='yellow', isOpen=False),
Card(number=2, color='red', isOpen=False)
]
self.assertTrue(self.player.isQuattro(), msg="Normal Quattro-Case Test Fail!")
self.player.user_deck = [
Card(number=0, color='zero', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=5, color='yellow', isOpen=False),
Card(number=2, color='red', isOpen=False)
]
self.assertTrue(self.player.isQuattro(), msg="Zero-include Quattro-Case Test Fail!")
self.player.user_deck = [
Card(number=0, color='zero', isOpen=False),
Card(number=0, color='zero', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=2, color='red', isOpen=False)
]
self.assertFalse(self.player.isQuattro(), msg="Two Zeros False Test Fail!")
self.player.user_deck = [
Card(number=3, color='blue', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=5, color='yellow', isOpen=False),
Card(number=2, color='red', isOpen=False)
]
self.assertFalse(self.player.isQuattro(), msg="Duplicate Color Test Fail!")
@get_method_name_decorator
def test_calculate_score_and_top(self):
self.player.user_deck = [
Card(number=1, color='green', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=1, color='yellow', isOpen=False),
Card(number=1, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 4, msg="Normal Quattro-Case Calculate Test Fail!")
self.assertEqual(self.player.top_card, 1, msg="Normal Quattro-Case Top-Card Test Fail!")
self.player.total_score = 0
self.player.top_card = -1
self.player.user_deck = [
Card(number=0, color='zero', isOpen=False),
Card(number=1, color='blue', isOpen=False),
Card(number=1, color='yellow', isOpen=False),
Card(number=1, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 3, msg="Zero-include Quattro-Case Calculate Test Fail!")
self.assertEqual(self.player.top_card, 1, msg="Zero-include Quattro-Case Top-Card Test Fail!")
self.player.total_score = 0
self.player.top_card = -1
self.player.user_deck = [
Card(number=0, color='zero', isOpen=False),
Card(number=0, color='zero', isOpen=False),
Card(number=1, color='yellow', isOpen=False),
Card(number=1, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 0, msg="Two Zeros Calculate Test Fail!")
self.assertEqual(self.player.top_card, 1, msg="Two Zeros Top-Card Test Fail!")
self.player.total_score = 0
self.player.top_card = -1
self.player.user_deck = [
Card(number=0, color='zero', isOpen=False),
Card(number=2, color='yellow', isOpen=False),
Card(number=1, color='yellow', isOpen=False),
Card(number=1, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 0, msg="Non-Quattro Case Calculate Test Fail!")
self.assertEqual(self.player.top_card, 2, msg="Non-Quattro Case Top-Card Test Fail!")
self.player.total_score = 0
self.player.top_card = -1
self.player.user_deck = [
Card(number=1, color='blue', isOpen=False),
Card(number=2, color='yellow', isOpen=False),
Card(number=2, color='green', isOpen=False),
Card(number=1, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 6, msg="Duplicate Color Top Case Calculate Test Fail!")
self.assertEqual(self.player.top_card, 2, msg="Duplicate Color Top Case Top-Card Test Fail!")
self.player.total_score = 0
self.player.top_card = -1
self.player.user_deck = [
Card(number=6, color='blue', isOpen=False),
Card(number=5, color='yellow', isOpen=False),
Card(number=4, color='green', isOpen=False),
Card(number=3, color='red', isOpen=False)
]
self.player.calculate_total_score()
self.assertEqual(self.player.total_score, 18, msg="Normal Top Case Calculate Test Fail!")
self.assertEqual(self.player.top_card, 6, msg="Normal Top Case Top-Card Test Fail!")
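The cases above distill the game rules the `Player` methods must implement: a Quattro needs four mutually distinct card colors (so the `zero` wildcard can appear at most once), a completed Quattro scores the sum of its numbers, and anything else scores zero. A stdlib restatement of those rules as an assumption (these helpers are mine, not the `QuattroComponents` API):

```python
def is_quattro(cards):
    # cards: (number, color) pairs. A Quattro needs four mutually
    # distinct colors, so 'zero' can appear at most once.
    return len({color for _, color in cards}) == 4

def total_score(cards):
    # A completed Quattro scores the sum of its numbers (zeros add
    # nothing); anything else scores zero.
    return sum(number for number, _ in cards) if is_quattro(cards) else 0

assert is_quattro([(1, 'green'), (1, 'blue'), (5, 'yellow'), (2, 'red')])
assert not is_quattro([(0, 'zero'), (0, 'zero'), (1, 'blue'), (2, 'red')])
assert total_score([(0, 'zero'), (1, 'blue'), (1, 'yellow'), (1, 'red')]) == 3
assert total_score([(6, 'blue'), (5, 'yellow'), (4, 'green'), (3, 'red')]) == 18
```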
# lapis_tests/test_job_caching.py (tfesenbecker/lapis, MIT)
from usim import time
from lapis.job import Job
from lapis_tests import via_usim, DummyDrone
class TestJobCaching(object):
@via_usim
async def test_calculation_time(self):
self.job = Job(resources={"walltime": 60},
used_resources={"walltime": 10, "cores": 0.7})
self.job.drone = DummyDrone(1)
starttime = time.now
await self.job._calculate()
assert time.now - starttime == 10
self.job = Job(resources={"walltime": 60, "inputfiles": {"file"}},
used_resources={"walltime": 10, "cores": 0.7})
self.job.drone = DummyDrone(1)
starttime = time.now
await self.job._calculate()
assert time.now - starttime == 7
self.job = Job(resources={"walltime": 60, "inputfiles": {"file"}},
used_resources={"walltime": 10, "cores": 0.7},
calculation_efficiency=0.5)
self.job.drone = DummyDrone(1)
starttime = time.now
await self.job._calculate()
assert time.now - starttime == 14
self.job = Job(resources={"walltime": 60, "inputfiles": {"file"}},
used_resources={"walltime": 10},
calculation_efficiency=0.5)
self.job.drone = DummyDrone(1)
starttime = time.now
await self.job._calculate()
assert time.now - starttime == 10
@via_usim
async def test_transfer_time(self):
conversion_GB_to_B = 1000 * 1000 * 1000
drone = DummyDrone(1)
self.job = Job(resources={"walltime": 60,
"inputfiles": {"file": {"usedsize": 20 *conversion_GB_to_B}}},
used_resources={"walltime": 10,
"inputfiles": {
"file": {"usedsize": 20 * conversion_GB_to_B,
"hitrates": {}}}
},
calculation_efficiency=1.0)
self.job.drone = drone
starttime = time.now
await self.job._transfer_inputfiles()
assert time.now - starttime == 20
self.job = Job(resources={"walltime": 60},
used_resources={"walltime": 10},
calculation_efficiency=1.0)
self.job.drone = drone
starttime = time.now
await self.job._transfer_inputfiles()
assert time.now - starttime == 0
self.job = Job(resources={"walltime": 60,
"inputfiles": {
"file": {"usedsize": 20 *conversion_GB_to_B}}},
used_resources={"walltime": 10},
calculation_efficiency=1.0)
self.job.drone = drone
starttime = time.now
await self.job._transfer_inputfiles()
assert time.now - starttime == 0
self.job = Job(resources={"walltime": 60,
"inputfiles": {
"file": {"usedsize": 20 * conversion_GB_to_B}}},
used_resources={"walltime": 10,
"inputfiles": {
"file": {"usedsize": 20 *
conversion_GB_to_B,
"hitrates": {}},
}
},
calculation_efficiency=1.0)
self.job.drone = drone
starttime = time.now
await self.job._transfer_inputfiles()
assert time.now - starttime == 20
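The four calculation cases and four transfer cases above pin down a simple timing relationship. The following stdlib sketch restates it under that reading (the helper names are mine, not lapis internals): efficiency scaling applies only when input files and a per-core usage are both recorded, and only files listed in `used_resources` are transferred at the drone's throughput.

```python
GB = 1000 ** 3  # the GB-to-byte conversion factor used in the tests

def expected_calc_time(walltime, cores=None, efficiency=1.0, has_inputfiles=False):
    # Efficiency scaling kicks in only when the job declares input
    # files and a per-core usage; otherwise the raw walltime is used.
    if has_inputfiles and cores is not None:
        return walltime * cores / efficiency
    return walltime

def expected_transfer_time(used_inputfiles, throughput_gb_s):
    # Only files recorded in used_resources are transferred; an empty
    # hitrate dict means nothing is served from cache.
    files = used_inputfiles or {}
    total_bytes = sum(f["usedsize"] for f in files.values())
    return total_bytes / (throughput_gb_s * GB)

assert expected_calc_time(10, cores=0.7) == 10
assert abs(expected_calc_time(10, cores=0.7, efficiency=0.5, has_inputfiles=True) - 14) < 1e-9
assert expected_transfer_time({"file": {"usedsize": 20 * GB}}, 1) == 20.0
assert expected_transfer_time(None, 1) == 0.0
```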
# test/ai4cities_invest_model_scenarios.py (enerflow/ai4cities-model, Apache-2.0)
from ai4cities.ai4cities_invest_model import solve_model, model_invest, model_invest_input, model_invest_results
import pandas as pd
import numpy as np
import copy
# Set solver
#solver = {'name':"cbc",'path': "C:/cbc-win64/cbc"}
solver = {'name':"cbc"}
# Import data
data = pd.read_csv('./data/data.csv', header=[0], sep=',', index_col=0, parse_dates = True)
# Copy data to new dataframe
df = copy.copy(data)
# Assign value to every unique month-year
df['TM'] = df.index.month
df['TY'] = df.index.year
df['month_order'] = df['TM'] + (df['TY'] - df['TY'][0])*12 - df['TM'][0]+1
df = df.drop(columns=['TM', 'TY'])
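The `month_order` expression can be sanity-checked in isolation; a minimal stdlib restatement of the same formula, with hypothetical example dates (consecutive calendar months map to consecutive integers starting at 1 for the first timestamp):

```python
def month_order(year, month, first_year, first_month):
    # Same arithmetic as the dataframe expression above.
    return month + (year - first_year) * 12 - first_month + 1

# Hypothetical series starting in November 2019:
assert month_order(2019, 11, 2019, 11) == 1
assert month_order(2020, 1, 2019, 11) == 3
assert month_order(2020, 11, 2019, 11) == 13
```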
#%% Tariff structures
# Example 1 - Fixed energy tariff
# Prices for Affärsverken Elnät AB (Fuse 16A, appartment)
fixed_charge = 14.5 # €/Month
df['grid_energy_import_fee'] = 45*0.001 # €/kWh
df['grid_energy_export_fee'] = 0 # €/kWh
df['grid_power_import_fee'] = 0 # €/kW-month
df['grid_power_export_fee'] = 0 # €/kW-month
# Example 2 - Power based tariff
# Prices for Bjäre Kraft ek för (Fuse 16, house)
fixed_charge = 14.1 # €/Month
df['grid_energy_import_fee'] = 0 # €/kWh
df['grid_energy_export_fee'] = 0 # €/kWh
df['grid_power_import_fee'] = 0 # €/kW-month
df['grid_power_export_fee'] = 0 # €/kW-month
# Set grid power fee for different months, days and hours
for i in [1,2,3,11,12]:
df.loc[(df.index.month == i),'grid_power_import_fee'] = 12.6
for i in [4,5,6,7,8,9,10]:
for j in list(range(0,5)):
for k in list(range(9,19)):
df.loc[(df.index.month == i) & (df.index.weekday == j) & (df.index.hour == k),'grid_power_import_fee'] = 7.5
# Example 3 - Time based tariff
# Prices for Ellevio AB Dalarna Södra (Fuse 16A, house)
fixed_charge = 25.5 # €/Month
df['grid_energy_import_fee'] = 9*0.001 # €/kWh
df['grid_energy_export_fee'] = 0 # €/kWh
df['grid_power_import_fee'] = 0 # €/kW-month
df['grid_power_export_fee'] = 0 # €/kW-month
# Set grid energy fee for different months, days and hours
for i in [1,2,3,11,12]:
for j in list(range(0,5)):
for k in list(range(8,22)):
df.loc[(df.index.month == i) & (df.index.weekday == j) & (df.index.hour == k),'grid_energy_import_fee'] = 58*0.001
for i in [4,5,6,7,8,9,10]:
df.loc[(df.index.month == i),'grid_energy_import_fee'] = 9*0.001
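The Example 3 masks above reduce to a single scalar rule per timestamp. A stdlib restatement of that rule (the function name is mine, values in EUR/kWh): winter weekdays between 08:00 and 21:59 pay the high rate, everything else the base rate.

```python
def energy_import_fee(month, weekday, hour):
    # Scalar equivalent of the Example 3 .loc masks above.
    if month in (1, 2, 3, 11, 12) and weekday < 5 and 8 <= hour < 22:
        return 58 * 0.001
    return 9 * 0.001

assert abs(energy_import_fee(1, 0, 9) - 0.058) < 1e-9    # winter weekday, daytime
assert abs(energy_import_fee(1, 5, 9) - 0.009) < 1e-9    # weekend
assert abs(energy_import_fee(6, 0, 9) - 0.009) < 1e-9    # summer
```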
#%% Scenario 1: No PV/Battery/Heating
data_dict = {
'demand': df['Electricity demand [kWh]'].to_list(),
'energy_price_buy': (df['DAM Price [EUR/MWh]']/1000).to_list(),
'energy_price_sell': (0.99*df['DAM Price [EUR/MWh]']/1000).to_list(),
'grid_fixed_fee': fixed_charge,
'grid_energy_import_fee': df['grid_energy_import_fee'].to_list(),
'grid_energy_export_fee': df['grid_energy_export_fee'].to_list(),
'grid_power_import_fee': df['grid_power_import_fee'].to_list(),
'grid_power_export_fee': df['grid_power_export_fee'].to_list(),
'emission_factor_grid': list(5*np.ones(len(df))),
'month_order': df['month_order'].to_list(),
'dt': 1.0,
}
# Create model data structure
model_data = model_invest_input(data_dict)
# Create model instance with data
model_instance = model_invest(model_data)
# Solve
solution = solve_model(model_instance, solver)
# Get results
results = model_invest_results(solution)
#%% Scenario 2: PV/Battery sizing - No Heating
data_dict = {
'invest_cost_battery': 20,
'invest_cost_pv': 30,
'battery_capacity_max': 5,
'pv_capacity_max': 20,
'pv_lifespan': 25,
'battery_lifespan': 10,
'discount_rate': 0.03,
'demand': df['Electricity demand [kWh]'].to_list(),
'generation_pu': df['PV 1kWp 180Azim 40Tilt Physical [kW]'].to_list(),
'battery_min_level': 0.0,
'battery_charge_max': 0.5,
'battery_discharge_max': 0.5,
'battery_efficiency_charge': 0.9,
'battery_efficiency_discharge': 0.9,
'battery_grid_charging': True,
'bel_ini_level': 0.0,
'bel_fin_level': 0.0,
'energy_price_buy': (df['DAM Price [EUR/MWh]']/1000).to_list(),
'energy_price_sell': (0.99*df['DAM Price [EUR/MWh]']/1000).to_list(),
'grid_fixed_fee': fixed_charge,
'grid_energy_import_fee': df['grid_energy_import_fee'].to_list(),
'grid_energy_export_fee': df['grid_energy_export_fee'].to_list(),
'grid_power_import_fee': df['grid_power_import_fee'].to_list(),
'grid_power_export_fee': df['grid_power_export_fee'].to_list(),
'emission_factor_grid': list(5*np.ones(len(df))),
'month_order': df['month_order'].to_list(),
'dt': 1.0,
}
# Create model data structure
model_data = model_invest_input(data_dict)
# Create model instance with data
model_instance = model_invest(model_data)
# Solve
solution = solve_model(model_instance, solver)
# Get results
results = model_invest_results(solution)
#%% Scenario 3: PV/Battery/Heat pump sizing
data_dict = {
'invest_cost_battery': 20,
'invest_cost_pv': 30,
'invest_cost_heat_pump': 10,
'battery_capacity_max': 5,
'pv_capacity_max': 20,
'heat_pump_capacity_max': 2000,
'pv_lifespan': 25,
'battery_lifespan': 10,
'heat_pump_lifespan': 25,
'discount_rate': 0.03,
'demand': df['Electricity demand [kWh]'].to_list(),
'generation_pu': df['PV 1kWp 180Azim 40Tilt Physical [kW]'].to_list(),
'heat_demand': df['Heat demand [kWh]'].to_list(),
'battery_min_level': 0.0,
'battery_charge_max': 0.5,
'battery_discharge_max': 0.5,
'battery_efficiency_charge': 0.9,
'battery_efficiency_discharge': 0.9,
'battery_grid_charging': True,
'bel_ini_level': 0.0,
'bel_fin_level': 0.0,
'energy_price_buy': (df['DAM Price [EUR/MWh]']/1000).to_list(),
'energy_price_sell': (0.99*df['DAM Price [EUR/MWh]']/1000).to_list(),
'grid_fixed_fee': fixed_charge,
'grid_energy_import_fee': df['grid_energy_import_fee'].to_list(),
'grid_energy_export_fee': df['grid_energy_export_fee'].to_list(),
'grid_power_import_fee': df['grid_power_import_fee'].to_list(),
'grid_power_export_fee': df['grid_power_export_fee'].to_list(),
'heat_capacity_factor': 0.1,
'emission_factor_grid': list(5*np.ones(len(df))),
'heat_pump_cop': 2.5,
'month_order': df['month_order'].to_list(),
'dt': 1.0,
}
# Create model data structure
model_data = model_invest_input(data_dict)
# Create model instance with data
model_instance = model_invest(model_data)
# Solve
solution = solve_model(model_instance, solver)
# Get results
results = model_invest_results(solution)
#%% Scenario 4: PV/Battery/Fuel boiler sizing
data_dict = {
'invest_cost_battery': 20,
'invest_cost_pv': 30,
'invest_cost_fuel_boiler': 10,
'battery_capacity_max': 5,
'pv_capacity_max': 20,
'fuel_boiler_capacity_max': 500,
'pv_lifespan': 25,
'battery_lifespan': 10,
'fuel_boiler_lifespan': 25,
'discount_rate': 0.03,
'demand': df['Electricity demand [kWh]'].to_list(),
'generation_pu': df['PV 1kWp 180Azim 40Tilt Physical [kW]'].to_list(),
'heat_demand': df['Heat demand [kWh]'].to_list(),
'battery_min_level': 0.0,
'battery_charge_max': 0.5,
'battery_discharge_max': 0.5,
'battery_efficiency_charge': 0.9,
'battery_efficiency_discharge': 0.9,
'battery_grid_charging': True,
'bel_ini_level': 0.0,
'bel_fin_level': 0.0,
'energy_price_buy': (df['DAM Price [EUR/MWh]']/1000).to_list(),
'energy_price_sell': (0.99*df['DAM Price [EUR/MWh]']/1000).to_list(),
'grid_fixed_fee': fixed_charge,
'grid_energy_import_fee': df['grid_energy_import_fee'].to_list(),
'grid_energy_export_fee': df['grid_energy_export_fee'].to_list(),
'grid_power_import_fee': df['grid_power_import_fee'].to_list(),
'grid_power_export_fee': df['grid_power_export_fee'].to_list(),
'fuel_price': list(20*np.ones(len(df))),
'heat_capacity_factor': 0.1,
'emission_factor_grid': list(5*np.ones(len(df))),
'emission_factor_fuel': list(5*np.ones(len(df))),
'fuel_boiler_efficiency': 0.9,
'month_order': df['month_order'].to_list(),
'dt': 1.0,
}
# Create model data structure
model_data = model_invest_input(data_dict)
# Create model instance with data
model_instance = model_invest(model_data)
# Solve
solution = solve_model(model_instance, solver)
# Get results
results = model_invest_results(solution)
# src/scml/tests/test_rescale_as_int.py (seahrh/kaggle-rig, MIT)
import numpy as np
import pandas as pd
from scml import rescale_as_int
class TestRescaleAsInt:
def test_int8_rescale(self):
dtype = np.int8
a = rescale_as_int(
pd.Series(
[0, 0.101, 0.201, 0.301, 0.401, 0.501, 0.601, 0.701, 0.801, 0.901, 1]
),
dtype=dtype,
)
assert list(a) == [0, 12, 25, 38, 50, 63, 76, 89, 101, 114, 127]
assert a.dtype == dtype
a = rescale_as_int(
pd.Series([0, 10.1, 20.1, 30.1, 40.1, 50.1, 60.1, 70.1, 80.1, 90.1, 100]),
dtype=dtype,
)
assert list(a) == [0, 12, 25, 38, 50, 63, 76, 89, 101, 114, 127]
assert a.dtype == dtype
def test_int16_rescale(self):
dtype = np.int16
a = rescale_as_int(
pd.Series(
[0, 0.101, 0.201, 0.301, 0.401, 0.501, 0.601, 0.701, 0.801, 0.901, 1]
),
dtype=dtype,
)
assert list(a) == [
0,
3309,
6586,
9862,
13139,
16416,
19692,
22969,
26246,
29523,
32767,
]
assert a.dtype == dtype
a = rescale_as_int(
pd.Series([0, 10.1, 20.1, 30.1, 40.1, 50.1, 60.1, 70.1, 80.1, 90.1, 100]),
dtype=dtype,
)
assert list(a) == [
0,
3309,
6586,
9862,
13139,
16416,
19692,
22969,
26246,
29523,
32767,
]
assert a.dtype == dtype
def test_int32_rescale(self):
dtype = np.int32
a = rescale_as_int(
pd.Series(
[0, 0.101, 0.201, 0.301, 0.401, 0.501, 0.601, 0.701, 0.801, 0.901, 1]
),
dtype=dtype,
)
assert list(a) == [
0,
216895848,
431644213,
646392577,
861140942,
1075889307,
1290637671,
1505386036,
1720134401,
1934882765,
2147483647,
]
assert a.dtype == dtype
a = rescale_as_int(
pd.Series([0, 10.1, 20.1, 30.1, 40.1, 50.1, 60.1, 70.1, 80.1, 90.1, 100]),
dtype=dtype,
)
assert list(a) == [
0,
216895848,
431644213,
646392577,
861140942,
1075889307,
1290637671,
1505386036,
1720134401,
1934882765,
2147483647,
]
assert a.dtype == dtype
def test_scaling_is_bounded(self):
a = rescale_as_int(
pd.Series([-0.2, 1.2]), min_value=0, max_value=1, dtype=np.int8
)
assert list(a) == [0, np.iinfo(np.int8).max]
a = rescale_as_int(
pd.Series([-0.2, 1.2]), min_value=0, max_value=1, dtype=np.int16
)
assert list(a) == [0, np.iinfo(np.int16).max]
a = rescale_as_int(
pd.Series([-0.2, 1.2]), min_value=0, max_value=1, dtype=np.int32
)
assert list(a) == [0, np.iinfo(np.int32).max]
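Taken together, the expectations above imply a specific behaviour: clip to the given (or data-derived) bounds, min-max normalise, scale to the target type's maximum, and truncate rather than round (0.101 maps to 12, not 13, for `int8`). A pure-Python reference sketch of that behaviour; this is an assumption distilled from the tests, not the `scml` implementation:

```python
def rescale_as_int_sketch(values, int_max, min_value=None, max_value=None):
    # Clip to [min, max], min-max normalise, scale to the target
    # integer type's max, then truncate toward zero.
    lo = min(values) if min_value is None else min_value
    hi = max(values) if max_value is None else max_value
    out = []
    for v in values:
        v = min(max(v, lo), hi)
        out.append(int((v - lo) / (hi - lo) * int_max))
    return out

assert rescale_as_int_sketch([0, 0.101, 0.501, 1], 127) == [0, 12, 63, 127]
assert rescale_as_int_sketch([0, 0.101, 1], 32767) == [0, 3309, 32767]
assert rescale_as_int_sketch([-0.2, 1.2], 127, min_value=0, max_value=1) == [0, 127]
```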
# standard-training/train_model.py (kaustubhsridhar/PoE-robustness, MIT)
import os
import argparse
parser = argparse.ArgumentParser(description='Wrapper for easy evaluation of downloaded models.')
parser.add_argument('--folder', default='train_checkpoints/5000_cifar10_resnet20_0.1', type=str)
args = parser.parse_args()
LR = float(args.folder.rsplit('_')[-1])
checkpoints_slash_B = args.folder.rsplit('_')[-4]
B = int(checkpoints_slash_B.split('/')[-1])
print(LR, B)
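The string surgery above assumes checkpoint folders named `<prefix>/<batch>_<dataset>_<arch>_<lr>`. Packaged as a helper for clarity (`parse_folder` is a hypothetical name, not part of this script):

```python
def parse_folder(folder):
    # Same parsing as above: learning rate is the last underscore
    # token, batch size is the token four from the end, with any
    # directory prefix stripped.
    parts = folder.rsplit('_')
    lr = float(parts[-1])
    batch = int(parts[-4].split('/')[-1])
    return lr, batch

assert parse_folder('train_checkpoints/5000_cifar10_resnet20_0.1') == (0.1, 5000)
assert parse_folder('train_checkpoints/128_cifar100_densenet_0.05') == (0.05, 128)
```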
if 'cifar10_resnet20' in args.folder:
cmd = 'python cifar_plus.py -a resnet --lr {} --train-batch {} --depth 20 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar10_resnet50' in args.folder:
cmd = 'python cifar_plus.py -a resnet --lr {} --train-batch {} --depth 50 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar10_resnet110' in args.folder:
cmd = 'python cifar_plus.py -a resnet --lr {} --train-batch {} --depth 110 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar10_densenet' in args.folder:
cmd = 'python cifar_plus.py -a densenet --lr {} --train-batch {} --depth 40 --growthRate 12 --epochs 300 --schedule 150 225 --wd 1e-4 --gamma 0.1 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar100_resnet50' in args.folder:
cmd = 'python cifar_plus.py -a resnet --dataset cifar100 --lr {} --train-batch {} --depth 50 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar100_resnet110' in args.folder:
cmd = 'python cifar_plus.py -a resnet --dataset cifar100 --lr {} --train-batch {} --depth 110 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4 --checkpoint {}'.format(LR, B, args.folder)
elif 'cifar100_densenet' in args.folder:
cmd = 'python cifar_plus.py -a densenet --dataset cifar100 --lr {} --train-batch {} --depth 40 --growthRate 12 --epochs 300 --schedule 150 225 --wd 1e-4 --gamma 0.1 --checkpoint {}'.format(LR, B, args.folder)
os.system(cmd)
# coding=utf-8
# sdk/python/pulumi_alicloud/cen/transit_router_peer_attachment.py (pulumi/pulumi-alicloud, ECL-2.0/Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['TransitRouterPeerAttachmentArgs', 'TransitRouterPeerAttachment']
@pulumi.input_type
class TransitRouterPeerAttachmentArgs:
def __init__(__self__, *,
cen_id: pulumi.Input[str],
peer_transit_router_id: pulumi.Input[str],
peer_transit_router_region_id: pulumi.Input[str],
auto_publish_route_enabled: Optional[pulumi.Input[bool]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
cen_bandwidth_package_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
resource_type: Optional[pulumi.Input[str]] = None,
route_table_association_enabled: Optional[pulumi.Input[bool]] = None,
route_table_propagation_enabled: Optional[pulumi.Input[bool]] = None,
transit_router_attachment_description: Optional[pulumi.Input[str]] = None,
transit_router_attachment_name: Optional[pulumi.Input[str]] = None,
transit_router_id: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a TransitRouterPeerAttachment resource.
:param pulumi.Input[str] cen_id: The ID of the CEN.
:param pulumi.Input[str] peer_transit_router_id: The ID of the peer transit router.
:param pulumi.Input[str] peer_transit_router_region_id: The region ID of peer transit router.
:param pulumi.Input[bool] auto_publish_route_enabled: Auto publish route enabled. The system default value is `false`.
:param pulumi.Input[int] bandwidth: The bandwidth of the bandwidth package.
:param pulumi.Input[str] cen_bandwidth_package_id: The ID of the bandwidth package. If you do not specify a package ID, a default test bandwidth of 1 bps is used, which is only suitable for verifying network connectivity.
:param pulumi.Input[bool] dry_run: Whether to perform pre-check for this request, including permission, instance status verification, etc.
:param pulumi.Input[str] resource_type: The resource type to attachment. Only support `VR` and default value is `VR`.
:param pulumi.Input[bool] route_table_association_enabled: Whether to association route table. System default is `false`.
:param pulumi.Input[bool] route_table_propagation_enabled: Whether to propagation route table. System default is `false`.
:param pulumi.Input[str] transit_router_attachment_description: The description of transit router attachment. The description is 2~256 characters long and must start with a letter or Chinese, but cannot start with `http://` or `https://`.
:param pulumi.Input[str] transit_router_attachment_name: The name of transit router attachment. The name is 2~128 characters in length, starts with uppercase and lowercase letters or Chinese, and can contain numbers, underscores (_) and dashes (-)
:param pulumi.Input[str] transit_router_id: The ID of the transit router to attach.
"""
pulumi.set(__self__, "cen_id", cen_id)
pulumi.set(__self__, "peer_transit_router_id", peer_transit_router_id)
pulumi.set(__self__, "peer_transit_router_region_id", peer_transit_router_region_id)
if auto_publish_route_enabled is not None:
pulumi.set(__self__, "auto_publish_route_enabled", auto_publish_route_enabled)
if bandwidth is not None:
pulumi.set(__self__, "bandwidth", bandwidth)
if cen_bandwidth_package_id is not None:
pulumi.set(__self__, "cen_bandwidth_package_id", cen_bandwidth_package_id)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if resource_type is not None:
pulumi.set(__self__, "resource_type", resource_type)
if route_table_association_enabled is not None:
pulumi.set(__self__, "route_table_association_enabled", route_table_association_enabled)
if route_table_propagation_enabled is not None:
pulumi.set(__self__, "route_table_propagation_enabled", route_table_propagation_enabled)
if transit_router_attachment_description is not None:
pulumi.set(__self__, "transit_router_attachment_description", transit_router_attachment_description)
if transit_router_attachment_name is not None:
pulumi.set(__self__, "transit_router_attachment_name", transit_router_attachment_name)
if transit_router_id is not None:
pulumi.set(__self__, "transit_router_id", transit_router_id)
@property
@pulumi.getter(name="cenId")
def cen_id(self) -> pulumi.Input[str]:
"""
The ID of the CEN.
"""
return pulumi.get(self, "cen_id")
@cen_id.setter
def cen_id(self, value: pulumi.Input[str]):
pulumi.set(self, "cen_id", value)
@property
@pulumi.getter(name="peerTransitRouterId")
def peer_transit_router_id(self) -> pulumi.Input[str]:
"""
The ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_id")
@peer_transit_router_id.setter
def peer_transit_router_id(self, value: pulumi.Input[str]):
pulumi.set(self, "peer_transit_router_id", value)
@property
@pulumi.getter(name="peerTransitRouterRegionId")
def peer_transit_router_region_id(self) -> pulumi.Input[str]:
"""
The region ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_region_id")
@peer_transit_router_region_id.setter
def peer_transit_router_region_id(self, value: pulumi.Input[str]):
pulumi.set(self, "peer_transit_router_region_id", value)
@property
@pulumi.getter(name="autoPublishRouteEnabled")
def auto_publish_route_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to publish routes automatically. Defaults to `false`.
"""
return pulumi.get(self, "auto_publish_route_enabled")
@auto_publish_route_enabled.setter
def auto_publish_route_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_publish_route_enabled", value)
@property
@pulumi.getter
def bandwidth(self) -> Optional[pulumi.Input[int]]:
"""
The bandwidth of the bandwidth package.
"""
return pulumi.get(self, "bandwidth")
@bandwidth.setter
def bandwidth(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "bandwidth", value)
@property
@pulumi.getter(name="cenBandwidthPackageId")
def cen_bandwidth_package_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the bandwidth package. If not set, a system default test bandwidth of 1 bps is used, which is only sufficient for verifying network connectivity.
"""
return pulumi.get(self, "cen_bandwidth_package_id")
@cen_bandwidth_package_id.setter
def cen_bandwidth_package_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cen_bandwidth_package_id", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to perform a pre-check of this request, including permission and instance status verification.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="resourceType")
def resource_type(self) -> Optional[pulumi.Input[str]]:
"""
The resource type of the attachment. Only `VR` is supported; the default value is `VR`.
"""
return pulumi.get(self, "resource_type")
@resource_type.setter
def resource_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_type", value)
@property
@pulumi.getter(name="routeTableAssociationEnabled")
def route_table_association_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to associate with the default route table. Defaults to `false`.
"""
return pulumi.get(self, "route_table_association_enabled")
@route_table_association_enabled.setter
def route_table_association_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "route_table_association_enabled", value)
@property
@pulumi.getter(name="routeTablePropagationEnabled")
def route_table_propagation_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to propagate routes to the default route table. Defaults to `false`.
"""
return pulumi.get(self, "route_table_propagation_enabled")
@route_table_propagation_enabled.setter
def route_table_propagation_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "route_table_propagation_enabled", value)
@property
@pulumi.getter(name="transitRouterAttachmentDescription")
def transit_router_attachment_description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the transit router attachment. The description must be 2 to 256 characters in length, start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
"""
return pulumi.get(self, "transit_router_attachment_description")
@transit_router_attachment_description.setter
def transit_router_attachment_description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_attachment_description", value)
@property
@pulumi.getter(name="transitRouterAttachmentName")
def transit_router_attachment_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the transit router attachment. The name must be 2 to 128 characters in length, start with a letter or a Chinese character, and may contain digits, underscores (_), and hyphens (-).
"""
return pulumi.get(self, "transit_router_attachment_name")
@transit_router_attachment_name.setter
def transit_router_attachment_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_attachment_name", value)
@property
@pulumi.getter(name="transitRouterId")
def transit_router_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the transit router to attach.
"""
return pulumi.get(self, "transit_router_id")
@transit_router_id.setter
def transit_router_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_id", value)
@pulumi.input_type
class _TransitRouterPeerAttachmentState:
def __init__(__self__, *,
auto_publish_route_enabled: Optional[pulumi.Input[bool]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
cen_bandwidth_package_id: Optional[pulumi.Input[str]] = None,
cen_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
peer_transit_router_id: Optional[pulumi.Input[str]] = None,
peer_transit_router_region_id: Optional[pulumi.Input[str]] = None,
resource_type: Optional[pulumi.Input[str]] = None,
route_table_association_enabled: Optional[pulumi.Input[bool]] = None,
route_table_propagation_enabled: Optional[pulumi.Input[bool]] = None,
status: Optional[pulumi.Input[str]] = None,
transit_router_attachment_description: Optional[pulumi.Input[str]] = None,
transit_router_attachment_id: Optional[pulumi.Input[str]] = None,
transit_router_attachment_name: Optional[pulumi.Input[str]] = None,
transit_router_id: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering TransitRouterPeerAttachment resources.
:param pulumi.Input[bool] auto_publish_route_enabled: Whether to publish routes automatically. Defaults to `false`.
:param pulumi.Input[int] bandwidth: The bandwidth of the bandwidth package.
:param pulumi.Input[str] cen_bandwidth_package_id: The ID of the bandwidth package. If not set, a system default test bandwidth of 1 bps is used, which is only sufficient for verifying network connectivity.
:param pulumi.Input[str] cen_id: The ID of the CEN.
:param pulumi.Input[bool] dry_run: Whether to perform a pre-check of this request, including permission and instance status verification.
:param pulumi.Input[str] peer_transit_router_id: The ID of the peer transit router.
:param pulumi.Input[str] peer_transit_router_region_id: The region ID of the peer transit router.
:param pulumi.Input[str] resource_type: The resource type of the attachment. Only `VR` is supported; the default value is `VR`.
:param pulumi.Input[bool] route_table_association_enabled: Whether to associate with the default route table. Defaults to `false`.
:param pulumi.Input[bool] route_table_propagation_enabled: Whether to propagate routes to the default route table. Defaults to `false`.
:param pulumi.Input[str] status: The association status of the attachment.
:param pulumi.Input[str] transit_router_attachment_description: The description of the transit router attachment. The description must be 2 to 256 characters in length, start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
:param pulumi.Input[str] transit_router_attachment_id: The ID of the transit router attachment.
:param pulumi.Input[str] transit_router_attachment_name: The name of the transit router attachment. The name must be 2 to 128 characters in length, start with a letter or a Chinese character, and may contain digits, underscores (_), and hyphens (-).
:param pulumi.Input[str] transit_router_id: The ID of the transit router to attach.
"""
if auto_publish_route_enabled is not None:
pulumi.set(__self__, "auto_publish_route_enabled", auto_publish_route_enabled)
if bandwidth is not None:
pulumi.set(__self__, "bandwidth", bandwidth)
if cen_bandwidth_package_id is not None:
pulumi.set(__self__, "cen_bandwidth_package_id", cen_bandwidth_package_id)
if cen_id is not None:
pulumi.set(__self__, "cen_id", cen_id)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if peer_transit_router_id is not None:
pulumi.set(__self__, "peer_transit_router_id", peer_transit_router_id)
if peer_transit_router_region_id is not None:
pulumi.set(__self__, "peer_transit_router_region_id", peer_transit_router_region_id)
if resource_type is not None:
pulumi.set(__self__, "resource_type", resource_type)
if route_table_association_enabled is not None:
pulumi.set(__self__, "route_table_association_enabled", route_table_association_enabled)
if route_table_propagation_enabled is not None:
pulumi.set(__self__, "route_table_propagation_enabled", route_table_propagation_enabled)
if status is not None:
pulumi.set(__self__, "status", status)
if transit_router_attachment_description is not None:
pulumi.set(__self__, "transit_router_attachment_description", transit_router_attachment_description)
if transit_router_attachment_id is not None:
pulumi.set(__self__, "transit_router_attachment_id", transit_router_attachment_id)
if transit_router_attachment_name is not None:
pulumi.set(__self__, "transit_router_attachment_name", transit_router_attachment_name)
if transit_router_id is not None:
pulumi.set(__self__, "transit_router_id", transit_router_id)
@property
@pulumi.getter(name="autoPublishRouteEnabled")
def auto_publish_route_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to publish routes automatically. Defaults to `false`.
"""
return pulumi.get(self, "auto_publish_route_enabled")
@auto_publish_route_enabled.setter
def auto_publish_route_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "auto_publish_route_enabled", value)
@property
@pulumi.getter
def bandwidth(self) -> Optional[pulumi.Input[int]]:
"""
The bandwidth of the bandwidth package.
"""
return pulumi.get(self, "bandwidth")
@bandwidth.setter
def bandwidth(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "bandwidth", value)
@property
@pulumi.getter(name="cenBandwidthPackageId")
def cen_bandwidth_package_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the bandwidth package. If not set, a system default test bandwidth of 1 bps is used, which is only sufficient for verifying network connectivity.
"""
return pulumi.get(self, "cen_bandwidth_package_id")
@cen_bandwidth_package_id.setter
def cen_bandwidth_package_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cen_bandwidth_package_id", value)
@property
@pulumi.getter(name="cenId")
def cen_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the CEN.
"""
return pulumi.get(self, "cen_id")
@cen_id.setter
def cen_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cen_id", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to perform a pre-check of this request, including permission and instance status verification.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="peerTransitRouterId")
def peer_transit_router_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_id")
@peer_transit_router_id.setter
def peer_transit_router_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "peer_transit_router_id", value)
@property
@pulumi.getter(name="peerTransitRouterRegionId")
def peer_transit_router_region_id(self) -> Optional[pulumi.Input[str]]:
"""
The region ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_region_id")
@peer_transit_router_region_id.setter
def peer_transit_router_region_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "peer_transit_router_region_id", value)
@property
@pulumi.getter(name="resourceType")
def resource_type(self) -> Optional[pulumi.Input[str]]:
"""
The resource type of the attachment. Only `VR` is supported; the default value is `VR`.
"""
return pulumi.get(self, "resource_type")
@resource_type.setter
def resource_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_type", value)
@property
@pulumi.getter(name="routeTableAssociationEnabled")
def route_table_association_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to associate with the default route table. Defaults to `false`.
"""
return pulumi.get(self, "route_table_association_enabled")
@route_table_association_enabled.setter
def route_table_association_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "route_table_association_enabled", value)
@property
@pulumi.getter(name="routeTablePropagationEnabled")
def route_table_propagation_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Whether to propagate routes to the default route table. Defaults to `false`.
"""
return pulumi.get(self, "route_table_propagation_enabled")
@route_table_propagation_enabled.setter
def route_table_propagation_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "route_table_propagation_enabled", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The association status of the attachment.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter(name="transitRouterAttachmentDescription")
def transit_router_attachment_description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the transit router attachment. The description must be 2 to 256 characters in length, start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
"""
return pulumi.get(self, "transit_router_attachment_description")
@transit_router_attachment_description.setter
def transit_router_attachment_description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_attachment_description", value)
@property
@pulumi.getter(name="transitRouterAttachmentId")
def transit_router_attachment_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the transit router attachment.
"""
return pulumi.get(self, "transit_router_attachment_id")
@transit_router_attachment_id.setter
def transit_router_attachment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_attachment_id", value)
@property
@pulumi.getter(name="transitRouterAttachmentName")
def transit_router_attachment_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the transit router attachment. The name must be 2 to 128 characters in length, start with a letter or a Chinese character, and may contain digits, underscores (_), and hyphens (-).
"""
return pulumi.get(self, "transit_router_attachment_name")
@transit_router_attachment_name.setter
def transit_router_attachment_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_attachment_name", value)
@property
@pulumi.getter(name="transitRouterId")
def transit_router_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the transit router to attach.
"""
return pulumi.get(self, "transit_router_id")
@transit_router_id.setter
def transit_router_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "transit_router_id", value)
class TransitRouterPeerAttachment(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_publish_route_enabled: Optional[pulumi.Input[bool]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
cen_bandwidth_package_id: Optional[pulumi.Input[str]] = None,
cen_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
peer_transit_router_id: Optional[pulumi.Input[str]] = None,
peer_transit_router_region_id: Optional[pulumi.Input[str]] = None,
resource_type: Optional[pulumi.Input[str]] = None,
route_table_association_enabled: Optional[pulumi.Input[bool]] = None,
route_table_propagation_enabled: Optional[pulumi.Input[bool]] = None,
transit_router_attachment_description: Optional[pulumi.Input[str]] = None,
transit_router_attachment_name: Optional[pulumi.Input[str]] = None,
transit_router_id: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Provides a CEN transit router peer attachment resource that associates the transit router with the CEN instance. [What is CEN transit router peer attachment](https://help.aliyun.com/document_detail/261363.html)
> **NOTE:** Available in v1.128.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud

config = pulumi.Config()
name = config.get("name")
if name is None:
    name = "tf-testAcccExample"
us = alicloud.Provider("us", region="us-east-1")
cn = alicloud.Provider("cn", region="cn-hangzhou")
default_instance = alicloud.cen.Instance("defaultInstance",
    cen_instance_name=name,
    protection_level="REDUCED",
    opts=pulumi.ResourceOptions(provider=cn))
default_bandwidth_package = alicloud.cen.BandwidthPackage("defaultBandwidthPackage",
    bandwidth=5,
    cen_bandwidth_package_name=name,
    geographic_region_a_id="China",
    geographic_region_b_id="North-America")
default_bandwidth_package_attachment = alicloud.cen.BandwidthPackageAttachment("defaultBandwidthPackageAttachment",
    instance_id=default_instance.id,
    bandwidth_package_id=default_bandwidth_package.id,
    opts=pulumi.ResourceOptions(provider=cn))
cn_transit_router = alicloud.cen.TransitRouter("cnTransitRouter",
    cen_id=default_instance.id,
    opts=pulumi.ResourceOptions(provider=cn,
        depends_on=[default_bandwidth_package_attachment]))
us_transit_router = alicloud.cen.TransitRouter("usTransitRouter",
    cen_id=default_instance.id,
    opts=pulumi.ResourceOptions(provider=us,
        depends_on=[cn_transit_router]))
default_transit_router_peer_attachment = alicloud.cen.TransitRouterPeerAttachment("defaultTransitRouterPeerAttachment",
    cen_id=default_instance.id,
    transit_router_id=cn_transit_router.transit_router_id,
    peer_transit_router_region_id="us-east-1",
    peer_transit_router_id=us_transit_router.transit_router_id,
    cen_bandwidth_package_id=default_bandwidth_package_attachment.bandwidth_package_id,
    bandwidth=5,
    transit_router_attachment_description=name,
    transit_router_attachment_name=name,
    opts=pulumi.ResourceOptions(provider=cn))
```
## Import
CEN transit router peer attachment can be imported using the id, e.g.
```sh
$ pulumi import alicloud:cen/transitRouterPeerAttachment:TransitRouterPeerAttachment example tr-********:tr-attach-*******
```
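Once imported, the same attachment can also be referenced from a program with the static `get` method. The sketch below assumes an already-imported resource; the composite ID is the masked placeholder from the import command above and must be replaced with a real `<cen_id>:<attachment_id>` pair:
```python
import pulumi
import pulumi_alicloud as alicloud

# Look up an existing attachment by its provider ID
# (placeholder ID copied from the import command above).
existing = alicloud.cen.TransitRouterPeerAttachment.get(
    "example",
    id="tr-********:tr-attach-*******")
pulumi.export("attachmentStatus", existing.status)
```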
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_publish_route_enabled: Whether to publish routes automatically. Defaults to `false`.
:param pulumi.Input[int] bandwidth: The bandwidth of the bandwidth package.
:param pulumi.Input[str] cen_bandwidth_package_id: The ID of the bandwidth package. If not set, a system default test bandwidth of 1 bps is used, which is only sufficient for verifying network connectivity.
:param pulumi.Input[str] cen_id: The ID of the CEN.
:param pulumi.Input[bool] dry_run: Whether to perform a pre-check of this request, including permission and instance status verification.
:param pulumi.Input[str] peer_transit_router_id: The ID of the peer transit router.
:param pulumi.Input[str] peer_transit_router_region_id: The region ID of the peer transit router.
:param pulumi.Input[str] resource_type: The resource type of the attachment. Only `VR` is supported; the default value is `VR`.
:param pulumi.Input[bool] route_table_association_enabled: Whether to associate with the default route table. Defaults to `false`.
:param pulumi.Input[bool] route_table_propagation_enabled: Whether to propagate routes to the default route table. Defaults to `false`.
:param pulumi.Input[str] transit_router_attachment_description: The description of the transit router attachment. The description must be 2 to 256 characters in length, start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
:param pulumi.Input[str] transit_router_attachment_name: The name of the transit router attachment. The name must be 2 to 128 characters in length, start with a letter or a Chinese character, and may contain digits, underscores (_), and hyphens (-).
:param pulumi.Input[str] transit_router_id: The ID of the transit router to attach.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: TransitRouterPeerAttachmentArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Provides a CEN transit router peer attachment resource that associates the transit router with the CEN instance. [What is CEN transit router peer attachment](https://help.aliyun.com/document_detail/261363.html)
> **NOTE:** Available in v1.128.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud

config = pulumi.Config()
name = config.get("name")
if name is None:
    name = "tf-testAcccExample"
us = alicloud.Provider("us", region="us-east-1")
cn = alicloud.Provider("cn", region="cn-hangzhou")
default_instance = alicloud.cen.Instance("defaultInstance",
    cen_instance_name=name,
    protection_level="REDUCED",
    opts=pulumi.ResourceOptions(provider=cn))
default_bandwidth_package = alicloud.cen.BandwidthPackage("defaultBandwidthPackage",
    bandwidth=5,
    cen_bandwidth_package_name=name,
    geographic_region_a_id="China",
    geographic_region_b_id="North-America")
default_bandwidth_package_attachment = alicloud.cen.BandwidthPackageAttachment("defaultBandwidthPackageAttachment",
    instance_id=default_instance.id,
    bandwidth_package_id=default_bandwidth_package.id,
    opts=pulumi.ResourceOptions(provider=cn))
cn_transit_router = alicloud.cen.TransitRouter("cnTransitRouter",
    cen_id=default_instance.id,
    opts=pulumi.ResourceOptions(provider=cn,
        depends_on=[default_bandwidth_package_attachment]))
us_transit_router = alicloud.cen.TransitRouter("usTransitRouter",
    cen_id=default_instance.id,
    opts=pulumi.ResourceOptions(provider=us,
        depends_on=[cn_transit_router]))
default_transit_router_peer_attachment = alicloud.cen.TransitRouterPeerAttachment("defaultTransitRouterPeerAttachment",
    cen_id=default_instance.id,
    transit_router_id=cn_transit_router.transit_router_id,
    peer_transit_router_region_id="us-east-1",
    peer_transit_router_id=us_transit_router.transit_router_id,
    cen_bandwidth_package_id=default_bandwidth_package_attachment.bandwidth_package_id,
    bandwidth=5,
    transit_router_attachment_description=name,
    transit_router_attachment_name=name,
    opts=pulumi.ResourceOptions(provider=cn))
```
## Import
CEN transit router peer attachment can be imported using the id, e.g.
```sh
$ pulumi import alicloud:cen/transitRouterPeerAttachment:TransitRouterPeerAttachment example tr-********:tr-attach-*******
```
:param str resource_name: The name of the resource.
:param TransitRouterPeerAttachmentArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(TransitRouterPeerAttachmentArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
auto_publish_route_enabled: Optional[pulumi.Input[bool]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
cen_bandwidth_package_id: Optional[pulumi.Input[str]] = None,
cen_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
peer_transit_router_id: Optional[pulumi.Input[str]] = None,
peer_transit_router_region_id: Optional[pulumi.Input[str]] = None,
resource_type: Optional[pulumi.Input[str]] = None,
route_table_association_enabled: Optional[pulumi.Input[bool]] = None,
route_table_propagation_enabled: Optional[pulumi.Input[bool]] = None,
transit_router_attachment_description: Optional[pulumi.Input[str]] = None,
transit_router_attachment_name: Optional[pulumi.Input[str]] = None,
transit_router_id: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = TransitRouterPeerAttachmentArgs.__new__(TransitRouterPeerAttachmentArgs)
__props__.__dict__["auto_publish_route_enabled"] = auto_publish_route_enabled
__props__.__dict__["bandwidth"] = bandwidth
__props__.__dict__["cen_bandwidth_package_id"] = cen_bandwidth_package_id
if cen_id is None and not opts.urn:
raise TypeError("Missing required property 'cen_id'")
__props__.__dict__["cen_id"] = cen_id
__props__.__dict__["dry_run"] = dry_run
if peer_transit_router_id is None and not opts.urn:
raise TypeError("Missing required property 'peer_transit_router_id'")
__props__.__dict__["peer_transit_router_id"] = peer_transit_router_id
if peer_transit_router_region_id is None and not opts.urn:
raise TypeError("Missing required property 'peer_transit_router_region_id'")
__props__.__dict__["peer_transit_router_region_id"] = peer_transit_router_region_id
__props__.__dict__["resource_type"] = resource_type
__props__.__dict__["route_table_association_enabled"] = route_table_association_enabled
__props__.__dict__["route_table_propagation_enabled"] = route_table_propagation_enabled
__props__.__dict__["transit_router_attachment_description"] = transit_router_attachment_description
__props__.__dict__["transit_router_attachment_name"] = transit_router_attachment_name
__props__.__dict__["transit_router_id"] = transit_router_id
__props__.__dict__["status"] = None
__props__.__dict__["transit_router_attachment_id"] = None
super(TransitRouterPeerAttachment, __self__).__init__(
'alicloud:cen/transitRouterPeerAttachment:TransitRouterPeerAttachment',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
auto_publish_route_enabled: Optional[pulumi.Input[bool]] = None,
bandwidth: Optional[pulumi.Input[int]] = None,
cen_bandwidth_package_id: Optional[pulumi.Input[str]] = None,
cen_id: Optional[pulumi.Input[str]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
peer_transit_router_id: Optional[pulumi.Input[str]] = None,
peer_transit_router_region_id: Optional[pulumi.Input[str]] = None,
resource_type: Optional[pulumi.Input[str]] = None,
route_table_association_enabled: Optional[pulumi.Input[bool]] = None,
route_table_propagation_enabled: Optional[pulumi.Input[bool]] = None,
status: Optional[pulumi.Input[str]] = None,
transit_router_attachment_description: Optional[pulumi.Input[str]] = None,
transit_router_attachment_id: Optional[pulumi.Input[str]] = None,
transit_router_attachment_name: Optional[pulumi.Input[str]] = None,
transit_router_id: Optional[pulumi.Input[str]] = None) -> 'TransitRouterPeerAttachment':
"""
Get an existing TransitRouterPeerAttachment resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] auto_publish_route_enabled: Whether to publish routes automatically. Defaults to `false`.
:param pulumi.Input[int] bandwidth: The bandwidth of the bandwidth package.
:param pulumi.Input[str] cen_bandwidth_package_id: The ID of the bandwidth package. If not set, a system default test bandwidth of 1 bps is used, which is only sufficient for verifying network connectivity.
:param pulumi.Input[str] cen_id: The ID of the CEN.
:param pulumi.Input[bool] dry_run: Whether to perform a pre-check of this request, including permission and instance status verification.
:param pulumi.Input[str] peer_transit_router_id: The ID of the peer transit router.
:param pulumi.Input[str] peer_transit_router_region_id: The region ID of the peer transit router.
:param pulumi.Input[str] resource_type: The resource type of the attachment. Only `VR` is supported; the default value is `VR`.
:param pulumi.Input[bool] route_table_association_enabled: Whether to associate with the default route table. Defaults to `false`.
:param pulumi.Input[bool] route_table_propagation_enabled: Whether to propagate routes to the default route table. Defaults to `false`.
:param pulumi.Input[str] status: The association status of the attachment.
:param pulumi.Input[str] transit_router_attachment_description: The description of the transit router attachment. The description must be 2 to 256 characters in length, start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
:param pulumi.Input[str] transit_router_attachment_id: The ID of the transit router attachment.
:param pulumi.Input[str] transit_router_attachment_name: The name of the transit router attachment. The name must be 2 to 128 characters in length, start with a letter or a Chinese character, and may contain digits, underscores (_), and hyphens (-).
:param pulumi.Input[str] transit_router_id: The ID of the transit router to attach.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _TransitRouterPeerAttachmentState.__new__(_TransitRouterPeerAttachmentState)
__props__.__dict__["auto_publish_route_enabled"] = auto_publish_route_enabled
__props__.__dict__["bandwidth"] = bandwidth
__props__.__dict__["cen_bandwidth_package_id"] = cen_bandwidth_package_id
__props__.__dict__["cen_id"] = cen_id
__props__.__dict__["dry_run"] = dry_run
__props__.__dict__["peer_transit_router_id"] = peer_transit_router_id
__props__.__dict__["peer_transit_router_region_id"] = peer_transit_router_region_id
__props__.__dict__["resource_type"] = resource_type
__props__.__dict__["route_table_association_enabled"] = route_table_association_enabled
__props__.__dict__["route_table_propagation_enabled"] = route_table_propagation_enabled
__props__.__dict__["status"] = status
__props__.__dict__["transit_router_attachment_description"] = transit_router_attachment_description
__props__.__dict__["transit_router_attachment_id"] = transit_router_attachment_id
__props__.__dict__["transit_router_attachment_name"] = transit_router_attachment_name
__props__.__dict__["transit_router_id"] = transit_router_id
return TransitRouterPeerAttachment(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="autoPublishRouteEnabled")
def auto_publish_route_enabled(self) -> pulumi.Output[Optional[bool]]:
"""
        Whether auto-publishing of routes is enabled. The system default value is `false`.
"""
return pulumi.get(self, "auto_publish_route_enabled")
@property
@pulumi.getter
def bandwidth(self) -> pulumi.Output[Optional[int]]:
"""
The bandwidth of the bandwidth package.
"""
return pulumi.get(self, "bandwidth")
@property
@pulumi.getter(name="cenBandwidthPackageId")
def cen_bandwidth_package_id(self) -> pulumi.Output[Optional[str]]:
"""
        The ID of the bandwidth package. If no bandwidth package ID is specified, a default test bandwidth of 1 bps is used, which allows you to verify network connectivity.
"""
return pulumi.get(self, "cen_bandwidth_package_id")
@property
@pulumi.getter(name="cenId")
def cen_id(self) -> pulumi.Output[str]:
"""
The ID of the CEN.
"""
return pulumi.get(self, "cen_id")
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> pulumi.Output[Optional[bool]]:
"""
Whether to perform pre-check for this request, including permission, instance status verification, etc.
"""
return pulumi.get(self, "dry_run")
@property
@pulumi.getter(name="peerTransitRouterId")
def peer_transit_router_id(self) -> pulumi.Output[str]:
"""
The ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_id")
@property
@pulumi.getter(name="peerTransitRouterRegionId")
def peer_transit_router_region_id(self) -> pulumi.Output[str]:
"""
        The region ID of the peer transit router.
"""
return pulumi.get(self, "peer_transit_router_region_id")
@property
@pulumi.getter(name="resourceType")
def resource_type(self) -> pulumi.Output[Optional[str]]:
"""
        The resource type of the attachment. Only `VR` is supported, and the default value is `VR`.
"""
return pulumi.get(self, "resource_type")
@property
@pulumi.getter(name="routeTableAssociationEnabled")
def route_table_association_enabled(self) -> pulumi.Output[Optional[bool]]:
"""
        Whether to associate with the route table. The system default is `false`.
"""
return pulumi.get(self, "route_table_association_enabled")
@property
@pulumi.getter(name="routeTablePropagationEnabled")
def route_table_propagation_enabled(self) -> pulumi.Output[Optional[bool]]:
"""
        Whether to propagate to the route table. The system default is `false`.
"""
return pulumi.get(self, "route_table_propagation_enabled")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
        The association status of the network.
"""
return pulumi.get(self, "status")
@property
@pulumi.getter(name="transitRouterAttachmentDescription")
def transit_router_attachment_description(self) -> pulumi.Output[Optional[str]]:
"""
        The description of the transit router attachment. The description must be 2 to 256 characters in length, must start with a letter or a Chinese character, and cannot start with `http://` or `https://`.
"""
return pulumi.get(self, "transit_router_attachment_description")
@property
@pulumi.getter(name="transitRouterAttachmentId")
def transit_router_attachment_id(self) -> pulumi.Output[str]:
"""
        The ID of the transit router attachment.
"""
return pulumi.get(self, "transit_router_attachment_id")
@property
@pulumi.getter(name="transitRouterAttachmentName")
def transit_router_attachment_name(self) -> pulumi.Output[Optional[str]]:
"""
        The name of the transit router attachment. The name must be 2 to 128 characters in length, must start with a letter or a Chinese character, and can contain digits, underscores (_), and hyphens (-).
"""
return pulumi.get(self, "transit_router_attachment_name")
@property
@pulumi.getter(name="transitRouterId")
def transit_router_id(self) -> pulumi.Output[Optional[str]]:
"""
The ID of the transit router to attach.
"""
return pulumi.get(self, "transit_router_id")
| 52.211778 | 255 | 0.693729 | 5,536 | 46,103 | 5.478324 | 0.046604 | 0.105018 | 0.054471 | 0.051504 | 0.93445 | 0.928581 | 0.919711 | 0.913842 | 0.907511 | 0.898114 | 0 | 0.00265 | 0.214108 | 46,103 | 882 | 256 | 52.270975 | 0.834401 | 0.36117 | 0 | 0.798301 | 1 | 0 | 0.146807 | 0.112877 | 0 | 0 | 0 | 0 | 0 | 1 | 0.165605 | false | 0.002123 | 0.010616 | 0 | 0.276008 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0fe025c41a4d760bfb352d5ac9377b0dde83b1a3 | 240 | py | Python | features/admin/user_admin.py | DAgostinateur/Woh-Bot-2.0 | 4e99d97218a59156bacb1669cc1cb6c8807dd5b1 | [
"MIT"
] | null | null | null | features/admin/user_admin.py | DAgostinateur/Woh-Bot-2.0 | 4e99d97218a59156bacb1669cc1cb6c8807dd5b1 | [
"MIT"
] | null | null | null | features/admin/user_admin.py | DAgostinateur/Woh-Bot-2.0 | 4e99d97218a59156bacb1669cc1cb6c8807dd5b1 | [
"MIT"
] | null | null | null | class UserAdmin:
def __init__(self, user_id, server_id):
self.user_id = user_id
self.server_id = server_id
    def __eq__(self, other):
        if not isinstance(other, UserAdmin):
            return NotImplemented
        return self.user_id == other.user_id and self.server_id == other.server_id
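Defining `__eq__` without `__hash__` makes `UserAdmin` instances unhashable in Python 3, so they cannot be used as set members or dict keys. A minimal sketch of the pairing (the `__hash__` addition and the type guard are suggestions, not part of the original class):

```python
class UserAdmin:
    def __init__(self, user_id, server_id):
        self.user_id = user_id
        self.server_id = server_id

    def __eq__(self, other):
        # Returning NotImplemented lets Python fall back to the other
        # operand's comparison instead of raising AttributeError.
        if not isinstance(other, UserAdmin):
            return NotImplemented
        return self.user_id == other.user_id and self.server_id == other.server_id

    def __hash__(self):
        # Hash on the same fields used for equality so equal objects share a hash.
        return hash((self.user_id, self.server_id))


admins = {UserAdmin(1, 10), UserAdmin(1, 10), UserAdmin(2, 10)}
print(len(admins))  # → 2, duplicates collapse because equality and hash agree
```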
| 30 | 82 | 0.675 | 37 | 240 | 3.891892 | 0.324324 | 0.208333 | 0.208333 | 0.194444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.233333 | 240 | 7 | 83 | 34.285714 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
ba261ec6c62bcff007eb1565b475fce0b43066f3 | 14,377 | py | Python | countess/tests/test_selection_basic_lib_noncoding.py | VariantEffect/Enrich2-py3 | 5f8534c8c9259d90d99d70e5bd9140fd0fdc8ea4 | [
"BSD-3-Clause"
] | 4 | 2020-01-14T19:24:07.000Z | 2020-01-16T18:11:35.000Z | countess/tests/test_selection_basic_lib_noncoding.py | VariantEffect/CountESS | 5f8534c8c9259d90d99d70e5bd9140fd0fdc8ea4 | [
"BSD-3-Clause"
] | 3 | 2020-01-01T10:38:15.000Z | 2020-01-03T09:45:41.000Z | countess/tests/test_selection_basic_lib_noncoding.py | VariantEffect/CountESS | 5f8534c8c9259d90d99d70e5bd9140fd0fdc8ea4 | [
"BSD-3-Clause"
] | 1 | 2022-02-20T00:35:24.000Z | 2022-02-20T00:35:24.000Z | import unittest
from copy import deepcopy
from ..selection.selection import Selection
from .methods import HDF5TestComponent
from .utilities import DEFAULT_STORE_PARAMS
from .utilities import load_config_data, update_cfg_file
CFG_FILE = "basic_selection_noncoding.json"
CFG_DIR = "data/config/selection/"
READS_DIR = "data/reads/selection/"
RESULT_DIR = "data/result/selection/"
DRIVER = "runTest"
LIBTYPE = "basic"
CODING_STR = "n"
FILE_EXT = "tsv"
FILE_SEP = "\t"
class TestSelectionBasicLibWLSScoringCompleteNormN(unittest.TestCase):
def setUp(self):
scoring = "WLS"
logr = "complete"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibWLSScoringFullNormN(unittest.TestCase):
def setUp(self):
scoring = "WLS"
logr = "full"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibWLSScoringWTNormN(unittest.TestCase):
def setUp(self):
scoring = "WLS"
logr = "wt"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibOLSScoringCompleteNormN(unittest.TestCase):
def setUp(self):
scoring = "OLS"
logr = "complete"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibOLSScoringFullNormN(unittest.TestCase):
def setUp(self):
scoring = "OLS"
logr = "full"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibOLSScoringWTNormN(unittest.TestCase):
def setUp(self):
scoring = "OLS"
logr = "wt"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibRatiosScoringCompleteNormN(unittest.TestCase):
def setUp(self):
scoring = "ratios"
logr = "complete"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibRatiosScoringFullNormN(unittest.TestCase):
def setUp(self):
scoring = "ratios"
logr = "full"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibRatiosScoringWTNormN(unittest.TestCase):
def setUp(self):
scoring = "ratios"
logr = "wt"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibCountsScoringCompleteNormN(unittest.TestCase):
def setUp(self):
scoring = "counts"
logr = "complete"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibCountsScoringFullNormN(unittest.TestCase):
def setUp(self):
scoring = "counts"
logr = "full"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibCountsScoringWTNormN(unittest.TestCase):
def setUp(self):
scoring = "counts"
logr = "wt"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibSimpleScoringCompleteNormN(unittest.TestCase):
def setUp(self):
scoring = "simple"
logr = "complete"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibSimpleScoringFullNormN(unittest.TestCase):
def setUp(self):
scoring = "simple"
logr = "full"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
class TestSelectionBasicLibSimpleScoringWTNormN(unittest.TestCase):
def setUp(self):
scoring = "simple"
logr = "wt"
cfg = load_config_data(CFG_FILE, CFG_DIR)
cfg = update_cfg_file(cfg, scoring, logr)
params = deepcopy(DEFAULT_STORE_PARAMS)
self.general_test_component = HDF5TestComponent(
store_constructor=Selection,
cfg=cfg,
result_dir=RESULT_DIR,
file_ext=FILE_EXT,
file_sep=FILE_SEP,
save=False,
params=params,
verbose=False,
libtype=LIBTYPE,
scoring_method=scoring,
logr_method=logr,
coding="noncoding",
)
self.general_test_component.setUp()
def tearDown(self):
self.general_test_component.tearDown()
def test_all_hdf5_dataframes(self):
self.general_test_component.runTest()
if __name__ == "__main__":
unittest.main()
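The fifteen test classes above differ only in their `scoring`/`logr` pair. As a hedged sketch (the class and constant names here are illustrative, not part of the suite), the same 5×3 matrix can be driven from one test with `unittest`'s `subTest`, so a failing pair is reported individually without the copy-paste:

```python
import unittest
from itertools import product

# The scoring and log-ratio methods exercised by the repeated classes above.
SCORING_METHODS = ["WLS", "OLS", "ratios", "counts", "simple"]
LOGR_METHODS = ["complete", "full", "wt"]


class TestSelectionBasicLibNoncodingMatrix(unittest.TestCase):
    def test_all_scoring_logr_combinations(self):
        # One subTest per (scoring, logr) pair; a failure names the pair.
        for scoring, logr in product(SCORING_METHODS, LOGR_METHODS):
            with self.subTest(scoring=scoring, logr=logr):
                # In the real suite this is where the HDF5TestComponent
                # would be constructed and run for this pair.
                self.assertIn(scoring, SCORING_METHODS)
                self.assertIn(logr, LOGR_METHODS)
```

The trade-off is that `setUp`/`tearDown` no longer run per pair automatically; the component lifecycle has to be managed inside the loop.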
| 29.461066 | 73 | 0.617305 | 1,464 | 14,377 | 5.759563 | 0.053962 | 0.078273 | 0.106736 | 0.170778 | 0.879032 | 0.879032 | 0.879032 | 0.879032 | 0.808231 | 0.808231 | 0 | 0.003093 | 0.302845 | 14,377 | 487 | 74 | 29.521561 | 0.838172 | 0 | 0 | 0.884521 | 0 | 0 | 0.027683 | 0.006608 | 0 | 0 | 0 | 0 | 0 | 1 | 0.110565 | false | 0 | 0.014742 | 0 | 0.162162 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
e886be48a12ea33c2b6cd69f63556c8e385424b7 | 3,720 | py | Python | z2/part3/updated_part2_batch/jm/parser_errors_2/690864861.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 1 | 2020-04-16T12:13:47.000Z | 2020-04-16T12:13:47.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/690864861.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:50:15.000Z | 2020-05-19T14:58:30.000Z | z2/part3/updated_part2_batch/jm/parser_errors_2/690864861.py | kozakusek/ipp-2020-testy | 09aa008fa53d159672cc7cbf969a6b237e15a7b8 | [
"MIT"
] | 18 | 2020-03-06T17:45:13.000Z | 2020-06-09T19:18:31.000Z | from part1 import (
gamma_board,
gamma_busy_fields,
gamma_delete,
gamma_free_fields,
gamma_golden_move,
gamma_golden_possible,
gamma_move,
gamma_new,
)
"""
scenario: test_random_actions
uuid: 690864861
"""
"""
random actions, total chaos
"""
board = gamma_new(5, 4, 4, 6)
assert board is not None
assert gamma_move(board, 1, 2, 4) == 0
assert gamma_move(board, 1, 2, 1) == 1
assert gamma_move(board, 2, 2, 3) == 1
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 0, 1) == 1
assert gamma_move(board, 3, 4, 3) == 1
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 0, 2) == 1
assert gamma_move(board, 4, 4, 3) == 0
assert gamma_move(board, 1, 3, 0) == 1
assert gamma_move(board, 1, 3, 2) == 1
assert gamma_move(board, 2, 3, 0) == 0
assert gamma_move(board, 2, 0, 0) == 1
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 2, 2) == 1
assert gamma_move(board, 4, 1, 2) == 1
assert gamma_busy_fields(board, 4) == 3
assert gamma_move(board, 1, 1, 1) == 1
assert gamma_move(board, 1, 1, 0) == 1
assert gamma_move(board, 2, 1, 3) == 1
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_move(board, 4, 3, 3) == 1
board372090703 = gamma_board(board)
assert board372090703 is not None
assert board372090703 == (".2243\n"
"4441.\n"
"311..\n"
"21.1.\n")
del board372090703
board372090703 = None
assert gamma_move(board, 1, 2, 4) == 0
assert gamma_free_fields(board, 1) == 6
assert gamma_move(board, 2, 1, 3) == 0
assert gamma_move(board, 2, 0, 0) == 0
assert gamma_golden_possible(board, 2) == 1
assert gamma_move(board, 3, 1, 3) == 0
assert gamma_move(board, 3, 2, 3) == 0
assert gamma_move(board, 4, 1, 2) == 0
assert gamma_move(board, 4, 4, 0) == 1
board862555635 = gamma_board(board)
assert board862555635 is not None
assert board862555635 == (".2243\n"
"4441.\n"
"311..\n"
"21.14\n")
del board862555635
board862555635 = None
assert gamma_move(board, 1, 3, 0) == 0
assert gamma_move(board, 1, 2, 2) == 0
assert gamma_move(board, 2, 0, 2) == 0
assert gamma_move(board, 2, 2, 2) == 0
assert gamma_move(board, 3, 1, 2) == 0
assert gamma_move(board, 4, 3, 1) == 1
assert gamma_move(board, 1, 4, 1) == 1
assert gamma_move(board, 2, 3, 0) == 0
assert gamma_move(board, 3, 2, 3) == 0
assert gamma_busy_fields(board, 3) == 2
assert gamma_free_fields(board, 3) == 3
assert gamma_move(board, 4, 1, 3) == 0
assert gamma_move(board, 4, 0, 1) == 0
assert gamma_move(board, 1, 3, 1) == 0
assert gamma_move(board, 2, 0, 2) == 0
assert gamma_move(board, 2, 0, 0) == 0
assert gamma_move(board, 3, 2, 0) == 1
assert gamma_move(board, 3, 0, 2) == 0
assert gamma_move(board, 4, 3, 1) == 0
assert gamma_move(board, 4, 4, 2) == 1
assert gamma_move(board, 1, 3, 0) == 0
assert gamma_busy_fields(board, 1) == 6
assert gamma_move(board, 2, 4, 2) == 0
assert gamma_move(board, 3, 3, 0) == 0
assert gamma_move(board, 4, 3, 0) == 0
assert gamma_move(board, 1, 2, 2) == 0
assert gamma_busy_fields(board, 1) == 6
assert gamma_golden_move(board, 1, 0, 0) == 1
assert gamma_move(board, 2, 1, 1) == 0
assert gamma_golden_move(board, 2, 0, 2) == 1
assert gamma_move(board, 3, 3, 0) == 0
assert gamma_move(board, 3, 1, 3) == 0
assert gamma_move(board, 4, 3, 0) == 0
assert gamma_move(board, 4, 1, 1) == 0
assert gamma_free_fields(board, 4) == 1
assert gamma_move(board, 1, 3, 0) == 0
assert gamma_move(board, 1, 0, 2) == 0
assert gamma_move(board, 2, 3, 0) == 0
assert gamma_move(board, 2, 4, 3) == 0
assert gamma_move(board, 3, 1, 2) == 0
assert gamma_golden_possible(board, 3) == 1
assert gamma_move(board, 4, 3, 0) == 0
gamma_delete(board)
| 31 | 46 | 0.660215 | 678 | 3,720 | 3.466077 | 0.060472 | 0.346383 | 0.382979 | 0.510638 | 0.802553 | 0.760426 | 0.708085 | 0.585957 | 0.537021 | 0.512766 | 0 | 0.133487 | 0.184409 | 3,720 | 119 | 47 | 31.260504 | 0.641068 | 0 | 0 | 0.378641 | 0 | 0 | 0.015419 | 0 | 0 | 0 | 0 | 0 | 0.76699 | 1 | 0 | false | 0 | 0.009709 | 0 | 0.009709 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fa09de37ca28dc725550747a7a2760e060743902 | 114 | py | Python | server/dockyard/driver/app/__init__.py | tor4z/Galileo-dockyard | c7f65c1466a1450315935ad03a5a504e7bdb1660 | [
"MIT"
] | null | null | null | server/dockyard/driver/app/__init__.py | tor4z/Galileo-dockyard | c7f65c1466a1450315935ad03a5a504e7bdb1660 | [
"MIT"
] | null | null | null | server/dockyard/driver/app/__init__.py | tor4z/Galileo-dockyard | c7f65c1466a1450315935ad03a5a504e7bdb1660 | [
"MIT"
] | null | null | null | from dockyard.driver.app._app import AppDriver as App
from dockyard.driver.app._build import BuildDriver as Build
| 38 | 59 | 0.842105 | 18 | 114 | 5.222222 | 0.5 | 0.255319 | 0.382979 | 0.446809 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 114 | 2 | 60 | 57 | 0.921569 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
3afbb11e2004132e14b096fb40c7a8ddaa5df863 | 29,979 | py | Python | web/transiq/restapi/serializers/owner.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | web/transiq/restapi/serializers/owner.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | 14 | 2020-06-05T23:06:45.000Z | 2022-03-12T00:00:18.000Z | web/transiq/restapi/serializers/owner.py | manibhushan05/transiq | 763fafb271ce07d13ac8ce575f2fee653cf39343 | [
"Apache-2.0"
] | null | null | null | from django.contrib.auth.models import User
from rest_framework import serializers, ISO_8601
from rest_framework.validators import UniqueValidator
from authentication.models import Profile
from driver.models import Driver, DriverAppUser
from owner.models import Route, Owner, Vehicle, FuelCard, FuelCardTransaction, VehicleSummary
from owner.vehicle_util import display_format
from restapi.helper_api import DATE_FORMAT
from restapi.serializers.authentication import ProfileSerializer, BankSerializer
from restapi.serializers.utils import CitySerializer
from restapi.service.validators import validate_vehicle_number, validate_pan
from utils.models import Address, City, VehicleCategory, Bank
class RouteSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
created_on = serializers.DateTimeField(read_only=True)
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
source = serializers.PrimaryKeyRelatedField(queryset=City.objects.all())
destination = serializers.PrimaryKeyRelatedField(queryset=City.objects.all())
def to_representation(self, instance):
self.fields["source"] = CitySerializer(read_only=True)
self.fields["destination"] = CitySerializer(read_only=True)
return super().to_representation(instance=instance)
def create(self, validated_data):
instance = Route.objects.create(**validated_data)
return instance
def update(self, instance, validated_data):
Route.objects.filter(id=instance.id).update(**validated_data)
return Route.objects.get(id=instance.id)
class OwnerSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
owner_address = serializers.CharField(allow_null=True, min_length=3, max_length=300, required=False)
pin = serializers.CharField(max_length=6, min_length=6, required=False)
route_temp = serializers.CharField(allow_null=True, min_length=3, max_length=300, required=False)
pan = serializers.CharField(allow_null=True, max_length=11, required=False,
validators=[UniqueValidator(queryset=Owner.objects.all())])
declaration = serializers.CharField(allow_null=True, max_length=255, required=False)
declaration_validity = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
input_formats=[DATE_FORMAT, ISO_8601])
created_on = serializers.DateTimeField(read_only=True, format=DATE_FORMAT, input_formats=[DATE_FORMAT, ISO_8601])
updated_on = serializers.DateTimeField(read_only=True)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
name = serializers.PrimaryKeyRelatedField(
queryset=User.objects.all(), validators=[UniqueValidator(queryset=Owner.objects.all())], required=False)
city = serializers.PrimaryKeyRelatedField(queryset=City.objects.all())
account_details = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Bank.objects.all(),
required=False,
validators=[UniqueValidator(queryset=Owner.objects.all())])
address = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Address.objects.all(), required=False,
validators=[UniqueValidator(queryset=Owner.objects.all())])
owner_profile = serializers.SerializerMethodField()
vehicle_list = serializers.SerializerMethodField()
city_data = serializers.SerializerMethodField()
owner_phone = serializers.CharField(write_only=True, required=False)
def validate_owner_phone(self, attrs):
if Owner.objects.exclude(deleted=True).filter(name__profile__phone=attrs).exists():
raise serializers.ValidationError("Owner Phone must be unique")
return attrs
    def validate_pan(self, attrs):
if attrs and not validate_pan(attrs):
raise serializers.ValidationError('PAN is not valid')
return attrs
def get_city_data(self, instance):
if isinstance(instance.city, City):
return {'id': instance.city.id, 'name': instance.city.name, 'state': instance.city.state_name}
return {}
def get_vehicle_list(self, instance):
if isinstance(instance, Owner):
return [{'id': vehicle.id, 'vehicle_number': vehicle.number()} for vehicle in
Vehicle.objects.filter(owner=instance)]
return []
def get_owner_profile(self, instance):
if isinstance(instance.name, User) and isinstance(instance.name.profile, Profile):
return ProfileSerializer(instance.name.profile).data
return {}
def create(self, validated_data):
routes = []
if "route" in validated_data.keys():
routes = validated_data.pop('route')
validated_data.pop("owner_phone")
instance = Owner.objects.create(**validated_data)
for route in routes:
instance.route.add(route)
return instance
def update(self, instance, validated_data):
routes = []
if "route" in validated_data.keys():
instance.route.clear()
routes = validated_data.pop('route')
Owner.objects.filter(id=instance.id).update(**validated_data)
for route in routes:
instance.route.add(route)
return Owner.objects.get(id=instance.id)
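`validate_pan` above delegates to an external `validate_pan` helper from `restapi.service.validators`. A standalone sketch of such a checker is below; it assumes the standard 10-character PAN layout (five letters, four digits, one letter) and is an illustration, not the project's actual validator:

```python
import re

# Indian PAN layout: five letters, four digits, one letter (e.g. "ABCDE1234F").
PAN_RE = re.compile(r"^[A-Z]{5}[0-9]{4}[A-Z]$")


def is_valid_pan(value):
    """Return True when value matches the 10-character PAN layout."""
    return bool(value) and bool(PAN_RE.match(value.upper()))


print(is_valid_pan("ABCDE1234F"))  # → True
print(is_valid_pan("1234567890"))  # → False
```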
class VehicleSerializer(serializers.Serializer):
id = serializers.IntegerField(label='ID', read_only=True)
vehicle_number = serializers.CharField(max_length=18, validators=[UniqueValidator(queryset=Vehicle.objects.all())])
    rc_number = serializers.CharField(allow_null=True, min_length=3, max_length=20, required=False)
    permit = serializers.CharField(allow_null=True, min_length=3, max_length=25, required=False)
permit_validity = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
input_formats=[DATE_FORMAT, ISO_8601])
    permit_type = serializers.CharField(allow_null=True, min_length=3, max_length=70, required=False)
vehicle_capacity = serializers.IntegerField(allow_null=True, label='Exact Vehicle Capacity in Kg',
max_value=2147483647,
min_value=-2147483648, required=False)
    body_type = serializers.ChoiceField(choices=(
('open', 'Open'), ('closed', 'Closed'), ('semi', 'Semi'), ('half', 'Half'), ('containerized', 'Containerized')),
required=True)
    vehicle_model = serializers.CharField(allow_null=True, min_length=3, max_length=30, required=False)
    chassis_number = serializers.CharField(allow_null=True, min_length=3, max_length=255, required=False)
    engine_number = serializers.CharField(allow_null=True, min_length=3, max_length=255, required=False)
    insurer = serializers.CharField(allow_null=True, min_length=3, max_length=100, required=False)
    insurance_number = serializers.CharField(allow_null=True, min_length=3, max_length=30, required=False)
insurance_validity = serializers.DateField(allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601],
format=DATE_FORMAT)
registration_year = serializers.DateField(allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601],
format=DATE_FORMAT)
registration_validity = serializers.DateField(allow_null=True, required=False,
input_formats=[DATE_FORMAT, ISO_8601],
format=DATE_FORMAT)
fitness_certificate_number = serializers.CharField(
        allow_null=True, min_length=3, max_length=255, required=False)
fitness_certificate_issued_on = serializers.DateField(
allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
fitness_certificate_validity_date = serializers.DateField(
allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    puc_certificate_number = serializers.CharField(allow_null=True, min_length=3, max_length=255, required=False)
puc_certificate_issued_on = serializers.DateField(
allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
puc_certificate_validity_date = serializers.DateField(
allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
status = serializers.ChoiceField(
choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
required=False)
gps_enabled = serializers.BooleanField(required=False)
supplier_name = serializers.CharField(allow_null=True, max_length=70, required=False)
supplier_phone = serializers.CharField(allow_null=True, max_length=30, required=False)
owner_name = serializers.CharField(allow_null=True, max_length=70, required=False)
owner_phone = serializers.CharField(allow_null=True, max_length=30, required=False)
updated_on = serializers.DateTimeField(read_only=True)
created_on = serializers.DateTimeField(read_only=True, format=DATE_FORMAT)
owner = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Owner.objects.all(), required=False)
driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(), required=False,
validators=[UniqueValidator(queryset=Vehicle.objects.all())])
driver_app_user = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=DriverAppUser.objects.all(),
required=False)
vehicle_type = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=VehicleCategory.objects.all(),
required=False)
route = serializers.PrimaryKeyRelatedField(many=True, queryset=Route.objects.all(), required=False)
bank_account = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Bank.objects.all(), required=False)
deleted = serializers.BooleanField(required=False)
deleted_on = serializers.DateTimeField(allow_null=True, required=False)
created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
owner_data = serializers.SerializerMethodField()
vehicle_type_data = serializers.SerializerMethodField()
vehicle_number_display = serializers.SerializerMethodField()
driver_data = serializers.SerializerMethodField()
documents = serializers.SerializerMethodField()
    def validate_vehicle_number(self, value):
        # Only run the module-level validator when a value was actually supplied;
        # checking `value` first avoids validating empty/None input.
        if value and not validate_vehicle_number(value):
            raise serializers.ValidationError("Not a valid vehicle number")
        return value
    def get_driver_data(self, instance):
        if isinstance(instance.driver, Driver):
            return {
                'id': instance.driver.id, 'name': instance.driver.name, 'phone': instance.driver.phone,
                'dl_number': instance.driver.driving_licence_number,
                'dl_validity': instance.driver.driving_licence_validity.strftime(
                    DATE_FORMAT) if instance.driver.driving_licence_validity else None,
                'dl_location': instance.driver.driving_licence_location}
        return {'id': '-1', 'name': '-', 'phone': None, 'dl_number': None, 'dl_validity': None, 'dl_location': None}

    def get_vehicle_number_display(self, instance):
        return display_format(instance.vehicle_number)

    def get_owner_data(self, instance):
        if isinstance(instance.owner, Owner):
            return {'id': instance.owner.id, 'name': instance.owner.get_name(), 'phone': instance.owner.get_phone()}
        return {'id': -1, 'name': '-', 'phone': '-'}

    def get_vehicle_type_data(self, instance):
        if isinstance(instance.vehicle_type, VehicleCategory):
            return {'id': instance.vehicle_type.id, 'name': instance.vehicle_type.get_name()}
        return {'id': 0, 'name': '-'}

    def get_documents(self, instance):
        if isinstance(instance, Vehicle):
            return [{'id': doc.id, 'url': doc.s3_upload.public_url(), 'document_category': doc.document_category,
                     'document_category_display': doc.get_document_category_display(),
                     'thumb_url': doc.s3_upload.public_url(),
                     'bucket': doc.s3_upload.bucket,
                     'folder': doc.s3_upload.folder,
                     'uuid': doc.s3_upload.uuid,
                     'filename': doc.s3_upload.filename,
                     'validity': None,
                     } for doc in instance.vehicle_files.exclude(deleted=True).exclude(s3_upload=None).order_by('id')]
        return []
    def create(self, validated_data):
        # Many-to-many 'route' ids cannot be passed to objects.create(), so
        # pop them first and attach after the instance exists.
        routes = validated_data.pop('route', [])
        instance = Vehicle.objects.create(**validated_data)
        for route in routes:
            instance.route.add(route)
        return instance

    def update(self, instance, validated_data):
        routes = []
        if 'route' in validated_data:
            instance.route.clear()
            routes = validated_data.pop('route')
        Vehicle.objects.filter(id=instance.id).update(**validated_data)
        for route in routes:
            instance.route.add(route)
        return Vehicle.objects.get(id=instance.id)
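The `create`/`update` pattern above hinges on removing the many-to-many `route` key from the payload before the remaining keys are saved. A minimal stand-alone sketch of that idiom (plain dicts, no Django; `split_routes` is an illustrative name, not part of the codebase):

```python
def split_routes(validated_data):
    # dict.pop with a default removes the key when present and returns the
    # default otherwise, so no separate membership check is needed.
    routes = validated_data.pop('route', [])
    return routes, validated_data

routes, rest = split_routes({'vehicle_number': 'ka01ab1234', 'route': [1, 2]})
print(routes)  # [1, 2]
print(rest)    # {'vehicle_number': 'ka01ab1234'}
```

Note that `pop` mutates the payload in place, which is exactly why the serializer can pass the remaining `**validated_data` straight to `objects.create()`.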
class VehicleSummarySerializer(serializers.Serializer):
    id = serializers.IntegerField(label='ID', read_only=True)
    accounting_summary = serializers.JSONField(style={'base_template': 'textarea.html'})
    created_on = serializers.DateTimeField(read_only=True)
    updated_on = serializers.DateTimeField(read_only=True)
    deleted = serializers.BooleanField(required=False)
    deleted_on = serializers.DateTimeField(allow_null=True, required=False)
    vehicle = serializers.PrimaryKeyRelatedField(queryset=Vehicle.objects.all(),
        validators=[UniqueValidator(queryset=VehicleSummary.objects.all())])

    def create(self, validated_data):
        pass

    def update(self, instance, validated_data):
        pass
class FMSVehicleSerializer(serializers.Serializer):
    id = serializers.IntegerField(label='ID', read_only=True)
    vehicle_number = serializers.CharField(write_only=True, max_length=18,
        validators=[UniqueValidator(queryset=Vehicle.objects.all())])
    rc_number = serializers.CharField(allow_null=True, max_length=20, required=False)
    permit = serializers.CharField(allow_null=True, min_length=3, max_length=25, required=False)
    permit_validity = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
        input_formats=[DATE_FORMAT, ISO_8601])
    permit_type = serializers.CharField(allow_null=True, max_length=70, required=False)
    vehicle_capacity = serializers.IntegerField(allow_null=True, label='Exact Vehicle Capacity in Kg',
        max_value=2147483647, min_value=-2147483648, required=False)
    body_type = serializers.ChoiceField(allow_null=True, choices=(
        ('open', 'Open'), ('closed', 'Closed'), ('semi', 'Semi'), ('half', 'Half'), ('containerized', 'Containerized')),
        required=False)
    vehicle_model = serializers.CharField(allow_null=True, max_length=30, required=False)
    chassis_number = serializers.CharField(allow_null=True, max_length=255, required=False)
    engine_number = serializers.CharField(allow_null=True, max_length=255, required=False)
    insurer = serializers.CharField(allow_null=True, max_length=100, required=False)
    insurance_number = serializers.CharField(allow_null=True, max_length=30, required=False)
    insurance_validity = serializers.DateField(allow_null=True, required=False,
        input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    registration_year = serializers.DateField(allow_null=True, required=False,
        input_formats=[DATE_FORMAT, ISO_8601], format='%Y')
    registration_validity = serializers.DateField(allow_null=True, required=False,
        input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    fitness_certificate_number = serializers.CharField(allow_null=True, max_length=255, required=False)
    fitness_certificate_issued_on = serializers.DateField(
        allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    fitness_certificate_validity_date = serializers.DateField(
        allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    puc_certificate_number = serializers.CharField(allow_null=True, max_length=255, required=False)
    puc_certificate_issued_on = serializers.DateField(
        allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    puc_certificate_validity_date = serializers.DateField(
        allow_null=True, required=False, input_formats=[DATE_FORMAT, ISO_8601], format=DATE_FORMAT)
    status = serializers.ChoiceField(
        choices=(('unloaded', 'unloaded'), ('loading', 'loading'), ('loaded', 'loaded'), ('unloading', 'unloading')),
        required=False)
    gps_enabled = serializers.BooleanField(required=False)
    supplier_name = serializers.CharField(allow_null=True, max_length=70, required=False)
    supplier_phone = serializers.CharField(allow_null=True, max_length=30, required=False)
    owner_name = serializers.CharField(allow_null=True, max_length=70, required=False)
    owner_phone = serializers.CharField(allow_null=True, max_length=30, required=False)
    updated_on = serializers.DateTimeField(read_only=True)
    created_on = serializers.DateTimeField(read_only=True, format=DATE_FORMAT)
    owner = serializers.PrimaryKeyRelatedField(write_only=True, allow_null=True, queryset=Owner.objects.all(),
        required=False)
    driver = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Driver.objects.all(), required=False,
        validators=[UniqueValidator(queryset=Vehicle.objects.all())])
    driver_app_user = serializers.PrimaryKeyRelatedField(write_only=True, allow_null=True,
        queryset=DriverAppUser.objects.all(), required=False)
    vehicle_type = serializers.PrimaryKeyRelatedField(write_only=True, allow_null=True,
        queryset=VehicleCategory.objects.all(), required=False)
    route = serializers.PrimaryKeyRelatedField(many=True, queryset=Route.objects.all(), required=False)
    bank_account = serializers.PrimaryKeyRelatedField(allow_null=True, queryset=Bank.objects.all(), required=False)
    deleted = serializers.BooleanField(required=False)
    deleted_on = serializers.DateTimeField(allow_null=True, required=False)
    created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
    changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
    owner_data = serializers.SerializerMethodField()
    vehicle_type_data = serializers.SerializerMethodField()
    vehicle_number_display = serializers.SerializerMethodField()
    driver_data = serializers.SerializerMethodField()
    documents = serializers.SerializerMethodField()
    def get_driver_data(self, instance):
        if isinstance(instance.driver, Driver):
            return {'id': instance.driver.id, 'name': instance.driver.name, 'phone': instance.driver.phone,
                    'dl_number': instance.driver.driving_licence_number,
                    'dl_validity': instance.driver.driving_licence_validity.strftime(
                        DATE_FORMAT) if instance.driver.driving_licence_validity else None,
                    'dl_location': instance.driver.driving_licence_location}
        return {'id': '-1', 'name': '-', 'phone': None, 'dl_number': None, 'dl_validity': None, 'dl_location': None}

    def to_representation(self, instance):
        self.fields['bank_account'] = BankSerializer(read_only=True)
        return super().to_representation(instance=instance)

    def get_vehicle_number_display(self, instance):
        return display_format(instance.vehicle_number)

    def get_owner_data(self, instance):
        if isinstance(instance.owner, Owner):
            return {'id': instance.owner.id, 'name': instance.owner.get_name(), 'phone': instance.owner.get_phone(),
                    'pan': instance.owner.pan, 'declaration_validity': instance.owner.declaration_validity.strftime(
                        DATE_FORMAT) if instance.owner.declaration_validity else None}
        return {'id': -1, 'name': None, 'phone': None, 'pan': None, 'declaration_validity': None}

    def get_vehicle_type_data(self, instance):
        if isinstance(instance.vehicle_type, VehicleCategory):
            return {'id': instance.vehicle_type.id, 'name': instance.vehicle_type.get_name()}
        return {'id': 0, 'name': '-'}

    def get_documents(self, instance):
        def doc_data(doc):
            # Shared shape for vehicle, owner and driver document entries.
            return {'id': doc.id, 'url': doc.s3_upload.public_url(), 'document_category': doc.document_category,
                    'document_category_display': doc.get_document_category_display(),
                    'thumb_url': doc.s3_upload.public_url(),
                    'bucket': doc.s3_upload.bucket,
                    'folder': doc.s3_upload.folder,
                    'uuid': doc.s3_upload.uuid,
                    'filename': doc.s3_upload.filename,
                    'validity': None}

        vehicle_docs = [doc_data(doc) for doc in
                        instance.vehicle_files.exclude(deleted=True).exclude(s3_upload=None)]
        if isinstance(instance.owner, Owner):
            owner_docs = [doc_data(doc) for doc in instance.owner.owner_files.exclude(s3_upload=None)]
        else:
            owner_docs = []
        if isinstance(instance.driver, Driver):
            driver_docs = [doc_data(doc) for doc in instance.driver.driver_files.exclude(s3_upload=None)]
        else:
            driver_docs = []
        return vehicle_docs + owner_docs + driver_docs
    @classmethod
    def many_init(cls, *args, **kwargs):
        # Many-item listings use a slimmed-down child serializer: drop the
        # heavyweight fields before handing off to ListSerializer.
        kwargs['child'] = cls()
        excluded_fields = [
            'documents', 'driver_data', 'driver_app_user', 'rc_number', 'permit', 'permit_validity', 'permit_type',
            'vehicle_capacity', 'chassis_number', 'engine_number', 'insurer', 'insurance_number', 'insurance_validity',
            'registration_year', 'registration_validity', 'fitness_certificate_number', 'fitness_certificate_issued_on',
            'fitness_certificate_validity_date', 'puc_certificate_number', 'puc_certificate_issued_on', 'route',
            'puc_certificate_validity_date', 'status', 'gps_enabled', 'supplier_name', 'supplier_phone', 'owner_name',
            'updated_on', 'created_on', 'deleted', 'deleted_on', 'created_by', 'changed_by', 'body_type',
            'vehicle_model', 'owner_phone', 'driver', 'owner_data'
        ]
        for field in excluded_fields:
            kwargs['child'].fields.pop(field)
        return serializers.ListSerializer(*args, **kwargs)

    def create(self, validated_data):
        routes = validated_data.pop('route', [])
        instance = Vehicle.objects.create(**validated_data)
        for route in routes:
            instance.route.add(route)
        return instance

    def update(self, instance, validated_data):
        routes = []
        if 'route' in validated_data:
            instance.route.clear()
            routes = validated_data.pop('route')
        Vehicle.objects.filter(id=instance.id).update(**validated_data)
        for route in routes:
            instance.route.add(route)
        return Vehicle.objects.get(id=instance.id)
class Select2FuelCardSerializer(serializers.Serializer):
    id = serializers.IntegerField(label='ID', read_only=True)
    text = serializers.SerializerMethodField()

    def get_text(self, instance):
        if isinstance(instance, FuelCard):
            return instance.card_number
        return None


class FuelCardSerializer(serializers.Serializer):
    id = serializers.IntegerField(label='ID', read_only=True)
    customer_id = serializers.CharField(max_length=30, required=False)
    card_number = serializers.CharField(allow_null=True, max_length=40, required=False,
        validators=[UniqueValidator(queryset=FuelCard.objects.all())])
    issue_date = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
        input_formats=[DATE_FORMAT, ISO_8601])
    expiry_date = serializers.DateField(allow_null=True, required=False, format=DATE_FORMAT,
        input_formats=[DATE_FORMAT, ISO_8601])
    update_on = serializers.DateTimeField(read_only=True)
    created_on = serializers.DateTimeField(read_only=True)
    deleted = serializers.BooleanField(required=False)
    deleted_on = serializers.DateTimeField(allow_null=True, required=False)
    created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
    changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")

    def create(self, validated_data):
        return FuelCard.objects.create(**validated_data)

    def update(self, instance, validated_data):
        FuelCard.objects.filter(id=instance.id).update(**validated_data)
        return FuelCard.objects.get(id=instance.id)


class FuelCardTransactionSerializer(serializers.Serializer):
    id = serializers.IntegerField(label='ID', read_only=True)
    paid_to = serializers.CharField(allow_null=True, max_length=70, required=False)
    amount = serializers.IntegerField(max_value=2147483647, min_value=-2147483648, required=False)
    payment_date = serializers.DateTimeField()
    update_on = serializers.DateTimeField(read_only=True)
    created_on = serializers.DateTimeField(read_only=True)
    deleted = serializers.BooleanField(required=False)
    deleted_on = serializers.DateTimeField(allow_null=True, required=False)
    created_by = serializers.SlugRelatedField(queryset=User.objects.all(), required=False, slug_field="username")
    changed_by = serializers.SlugRelatedField(queryset=User.objects.all(), slug_field="username")
    vehicle = serializers.PrimaryKeyRelatedField(queryset=Vehicle.objects.all())
    fuel_card = serializers.PrimaryKeyRelatedField(queryset=FuelCard.objects.all())

    def to_representation(self, instance):
        self.fields["vehicle"] = VehicleSerializer(read_only=True)
        self.fields["fuel_card"] = FuelCardSerializer(read_only=True)
        return super().to_representation(instance=instance)

    def create(self, validated_data):
        return FuelCardTransaction.objects.create(**validated_data)

    def update(self, instance, validated_data):
        FuelCardTransaction.objects.filter(id=instance.id).update(**validated_data)
        return FuelCardTransaction.objects.get(id=instance.id)
d737f21d1b1f5f322229f36ed56d99824a855d79 | 10,609 | py | Python | venv/lib/python3.7/site-packages/setuptools_git/tests.py | vchiapaikeo/prophet | e8c250ca7bfffc280baa7dabc80a2c2d1f72c6a7 | [
"MIT"
] | 15 | 2015-05-01T22:09:55.000Z | 2021-04-25T17:05:18.000Z | venv/lib/python3.7/site-packages/setuptools_git/tests.py | vchiapaikeo/prophet | e8c250ca7bfffc280baa7dabc80a2c2d1f72c6a7 | [
"MIT"
] | 7 | 2016-12-08T21:54:25.000Z | 2018-05-13T14:30:54.000Z | venv/lib/python3.7/site-packages/setuptools_git/tests.py | vchiapaikeo/prophet | e8c250ca7bfffc280baa7dabc80a2c2d1f72c6a7 | [
"MIT"
] | 12 | 2015-05-19T18:14:30.000Z | 2018-08-29T20:35:19.000Z | # -*- coding: utf-8 -*-
import sys
import os
import tempfile
import unittest

from os.path import realpath, join

from setuptools_git.utils import rmtree
from setuptools_git.utils import posix
from setuptools_git.utils import fsdecode
from setuptools_git.utils import hfs_quote
from setuptools_git.utils import decompose


class GitTestCase(unittest.TestCase):

    def setUp(self):
        self.old_cwd = os.getcwd()
        self.directory = self.new_repo()

    def tearDown(self):
        os.chdir(self.old_cwd)
        rmtree(self.directory)

    def new_repo(self):
        from setuptools_git.utils import check_call
        directory = realpath(tempfile.mkdtemp())
        os.chdir(directory)
        check_call(['git', 'init', '--quiet', os.curdir])
        return directory

    def create_file(self, *path):
        fd = open(join(*path), 'wt')
        fd.write('dummy\n')
        fd.close()

    def create_dir(self, *path):
        os.makedirs(join(*path))

    def create_git_file(self, *path):
        from setuptools_git.utils import check_call
        filename = join(*path)
        fd = open(filename, 'wt')
        fd.write('dummy\n')
        fd.close()
        check_call(['git', 'add', filename])
        check_call(['git', 'commit', '--quiet', '-m', 'add new file'])
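The fixture helpers above build a throwaway repository and commit dummy files into it. A self-contained sketch of the same scratch-file mechanics without the git calls (`make_scratch_file` is an illustrative name, not part of setuptools_git):

```python
import os
import shutil
import tempfile

def make_scratch_file(directory, *path):
    # Mirrors create_git_file() above minus the git add/commit: join the
    # path, ensure parent directories exist, write a dummy line.
    filename = os.path.join(directory, *path)
    os.makedirs(os.path.dirname(filename), exist_ok=True)
    with open(filename, 'wt') as fd:
        fd.write('dummy\n')
    return filename

tmp = tempfile.mkdtemp()
try:
    created = make_scratch_file(tmp, 'subdir', 'entry.txt')
    print(os.path.exists(created))  # True
finally:
    shutil.rmtree(tmp)  # equivalent of the tearDown() cleanup
```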
class gitlsfiles_tests(GitTestCase):

    def gitlsfiles(self, *a, **kw):
        from setuptools_git import gitlsfiles
        return gitlsfiles(*a, **kw)

    def test_at_repo_root(self):
        self.create_git_file('root.txt')
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([posix(realpath('root.txt'))]))

    def test_at_repo_root_with_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([posix(realpath('root.txt')),
                 posix(realpath('subdir/entry.txt'))]))

    def test_at_repo_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        self.assertEqual(
            set(self.gitlsfiles(join(self.directory, 'subdir'))),
            set([posix(realpath('root.txt')),
                 posix(realpath('subdir/entry.txt'))]))

    def test_nonascii_filename(self):
        filename = 'héhé.html'
        # HFS Plus uses decomposed UTF-8
        if sys.platform == 'darwin':
            filename = decompose(filename)
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [filename])
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([posix(realpath(filename))]))

    def test_utf8_filename(self):
        if sys.version_info >= (3,):
            filename = 'héhé.html'.encode('utf-8')
        else:
            filename = 'héhé.html'
        # HFS Plus uses decomposed UTF-8
        if sys.platform == 'darwin':
            filename = decompose(filename)
        # Windows does not like byte filenames under Python 3
        if sys.platform == 'win32' and sys.version_info >= (3,):
            filename = filename.decode('utf-8')
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [fsdecode(filename)])
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([posix(realpath(fsdecode(filename)))]))

    def test_latin1_filename(self):
        if sys.version_info >= (3,):
            filename = 'héhé.html'.encode('latin-1')
        else:
            filename = 'h\xe9h\xe9.html'
        # HFS Plus quotes unknown bytes
        if sys.platform == 'darwin':
            filename = hfs_quote(filename)
        # Windows does not like byte filenames under Python 3
        if sys.platform == 'win32' and sys.version_info >= (3,):
            filename = filename.decode('latin-1')
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [fsdecode(filename)])
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([posix(realpath(fsdecode(filename)))]))

    def test_empty_repo(self):
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [])
        self.assertEqual(
            set(self.gitlsfiles(self.directory)),
            set([]))

    def test_empty_dirname(self):
        self.create_git_file('root.txt')
        self.assertEqual(
            set(self.gitlsfiles()),
            set([posix(realpath('root.txt'))]))

    def test_directory_only_contains_another_directory(self):
        self.create_dir('foo/bar')
        self.create_git_file('foo/bar/root.txt')
        self.assertEqual(
            set(self.gitlsfiles()),
            set([posix(realpath(join('foo', 'bar', 'root.txt')))]))

    def test_empty_dirname_in_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        os.chdir(join(self.directory, 'subdir'))
        self.assertEqual(
            set(self.gitlsfiles()),
            set([posix(realpath('../root.txt')),
                 posix(realpath('../subdir/entry.txt'))]))

    def test_git_error(self):
        import setuptools_git
        from setuptools_git.utils import CalledProcessError

        def do_raise(*args, **kw):
            raise CalledProcessError(1, 'git')

        self.create_git_file('root.txt')
        saved = setuptools_git.check_output
        setuptools_git.check_output = do_raise
        try:
            self.assertEqual(self.gitlsfiles(), set())
        finally:
            setuptools_git.check_output = saved
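`test_git_error` above monkeypatches `setuptools_git.check_output` and restores it in a `finally` block. The same save/patch/restore pattern in a self-contained form, using a stand-in namespace object instead of a real module (`with_patched` and `fakemod` are illustrative names):

```python
import types

# Stand-in for a module whose attribute we want to swap temporarily.
fakemod = types.SimpleNamespace(check_output=lambda: 'real')

def with_patched(mod, name, replacement, fn):
    # Save the original, install the replacement, and restore it in
    # `finally` so the patch cannot leak even if fn() raises.
    saved = getattr(mod, name)
    setattr(mod, name, replacement)
    try:
        return fn()
    finally:
        setattr(mod, name, saved)

result = with_patched(fakemod, 'check_output', lambda: 'patched',
                      lambda: fakemod.check_output())
print(result)                  # patched
print(fakemod.check_output())  # real  (restored after the call)
```

In modern test code this is usually done with `unittest.mock.patch`, but the manual try/finally form used by the tests above has the same guarantee.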
class listfiles_tests(GitTestCase):

    def listfiles(self, *a, **kw):
        from setuptools_git import listfiles
        return listfiles(*a, **kw)

    def test_at_repo_root(self):
        self.create_git_file('root.txt')
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set(['root.txt']))

    def test_at_repo_root_with_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set(['root.txt', join('subdir', 'entry.txt')]))

    def test_at_repo_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        self.assertEqual(
            set(self.listfiles(join(self.directory, 'subdir'))),
            set(['entry.txt']))

    def test_nonascii_filename(self):
        filename = 'héhé.html'
        # HFS Plus uses decomposed UTF-8
        if sys.platform == 'darwin':
            filename = decompose(filename)
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [filename])
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set([filename]))

    def test_utf8_filename(self):
        if sys.version_info >= (3,):
            filename = 'héhé.html'.encode('utf-8')
        else:
            filename = 'héhé.html'
        # HFS Plus uses decomposed UTF-8
        if sys.platform == 'darwin':
            filename = decompose(filename)
        # Windows does not like byte filenames under Python 3
        if sys.platform == 'win32' and sys.version_info >= (3,):
            filename = filename.decode('utf-8')
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [fsdecode(filename)])
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set([fsdecode(filename)]))

    def test_latin1_filename(self):
        if sys.version_info >= (3,):
            filename = 'héhé.html'.encode('latin-1')
        else:
            filename = 'h\xe9h\xe9.html'
        # HFS Plus quotes unknown bytes
        if sys.platform == 'darwin':
            filename = hfs_quote(filename)
        # Windows does not like byte filenames under Python 3
        if sys.platform == 'win32' and sys.version_info >= (3,):
            filename = filename.decode('latin-1')
        self.create_git_file(filename)
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [fsdecode(filename)])
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set([fsdecode(filename)]))

    def test_empty_repo(self):
        self.assertEqual(
            [fn for fn in os.listdir(self.directory) if fn[0] != '.'],
            [])
        self.assertEqual(
            set(self.listfiles(self.directory)),
            set([]))

    def test_empty_dirname(self):
        self.create_git_file('root.txt')
        self.assertEqual(
            set(self.listfiles()),
            set(['root.txt']))

    def test_directory_only_contains_another_directory(self):
        self.create_dir('foo/bar')
        self.create_git_file('foo/bar/root.txt')
        self.assertEqual(
            set(self.listfiles()),
            set([join('foo', 'bar', 'root.txt')]))

    def test_empty_dirname_in_subdir(self):
        self.create_git_file('root.txt')
        self.create_dir('subdir')
        self.create_git_file('subdir', 'entry.txt')
        os.chdir(join(self.directory, 'subdir'))
        self.assertEqual(
            set(self.listfiles()),
            set(['entry.txt']))

    def test_git_error(self):
        import setuptools_git
        from setuptools_git.utils import CalledProcessError

        def do_raise(*args, **kw):
            raise CalledProcessError(1, 'git')

        self.create_git_file('root.txt')
        saved = setuptools_git.check_output
        setuptools_git.check_output = do_raise
        try:
            for filename in self.listfiles():
                self.fail('unexpected results')
        finally:
            setuptools_git.check_output = saved
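The `decompose` calls in the filename tests exist because HFS+ stores names in decomposed (NFD-like) Unicode form, so a composed `é` written by the test comes back as `e` plus a combining accent. A stdlib sketch of that equivalence (assuming `decompose` corresponds to NFD normalization, which the test comments suggest):

```python
import unicodedata

composed = 'h\u00e9h\u00e9'  # NFC form: each 'é' is one code point
decomposed = unicodedata.normalize('NFD', composed)  # 'e' + U+0301 per 'é'

# Canonically equivalent, but not equal as code-point sequences -- which is
# why listdir() results must be normalized before comparing on HFS+.
print(composed == decomposed)                                 # False
print(unicodedata.normalize('NFC', decomposed) == composed)   # True
```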
d789807efd5325ce92b9c56fccb1d9a43a6fce75 | 173 | py | Python | tottle/types/updates/callback_query.py | muffleo/tottle | 69a5bdda879ab56d43505d517d3369a687c135a2 | [
"MIT"
] | 12 | 2020-09-06T15:31:34.000Z | 2021-02-27T20:30:34.000Z | tottle/types/updates/callback_query.py | cyanlabs-org/tottle | 6cf02022ed7b445c9b5af475c6e854b91780d792 | [
"MIT"
] | 2 | 2021-04-13T06:43:42.000Z | 2021-07-07T20:52:39.000Z | tottle/types/updates/callback_query.py | cyanlabs-org/tottle | 6cf02022ed7b445c9b5af475c6e854b91780d792 | [
"MIT"
] | 4 | 2020-09-12T03:09:25.000Z | 2021-03-22T08:52:04.000Z | from tottle.types.objects.query import CallbackQuery
from tottle.types.updates.base import BaseBotUpdate

class CallbackQueryUpdate(BaseBotUpdate, CallbackQuery):
    pass
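`CallbackQueryUpdate` adds no behavior of its own; it simply composes two bases, and attribute lookup follows Python's method resolution order (bases left to right). A minimal stand-alone sketch of that pattern (class names here are illustrative, not tottle's):

```python
class Query:
    kind = 'query'  # data carried by the payload-type base

class BotUpdate:
    def describe(self):
        # Behavior from the update base, reading data from the other base.
        return 'update:' + self.kind

class CallbackUpdate(BotUpdate, Query):
    pass  # empty combining subclass, like CallbackQueryUpdate above

u = CallbackUpdate()
print(u.describe())  # update:query
```

The MRO puts `BotUpdate` before `Query`, so methods defined on both would resolve to `BotUpdate`'s version first.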
ad21881e396ca87dd762cc1bc06d7ac12178d460 | 124 | py | Python | obsvail/__init__.py | sandipan1/robo_rl | 3bcb7caabeba71dd747fadf2355ac42408b7f340 | [
"MIT"
] | 5 | 2018-10-16T03:48:02.000Z | 2021-10-01T08:58:05.000Z | obsvail/__init__.py | sandipan1/robo_rl | 3bcb7caabeba71dd747fadf2355ac42408b7f340 | [
"MIT"
] | 1 | 2018-10-17T16:19:14.000Z | 2018-10-31T06:19:30.000Z | obsvail/__init__.py | sandipan1/robo_rl | 3bcb7caabeba71dd747fadf2355ac42408b7f340 | [
"MIT"
] | null | null | null | from robo_rl.obsvail.obsvail import ObsVAIL
from robo_rl.obsvail.obsvail_parser import get_obsvail_parser, get_logfile_name
ad47b14ef93536bdb9ce5fc373a2b332dfa79c38 | 186 | py | Python | EDyA_II/5_Files/seek.py | jrg-sln/academy | 498c11dcfeab78dbbbb77045a13d7d6675c0d150 | [
"MIT"
] | null | null | null | EDyA_II/5_Files/seek.py | jrg-sln/academy | 498c11dcfeab78dbbbb77045a13d7d6675c0d150 | [
"MIT"
] | null | null | null | EDyA_II/5_Files/seek.py | jrg-sln/academy | 498c11dcfeab78dbbbb77045a13d7d6675c0d150 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
print("seek()")
with open("seek.py", "r") as file:
    print(file.tell())
    print(file.readline())
    print(file.tell())
    print(file.seek(30))
    print(file.readline())
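The demo above reads its own source, so the offsets depend on the file's contents. A variant of the same `tell()`/`readline()`/`seek()` sequence against a temporary file with known contents:

```python
import os
import tempfile

# Write a file with predictable contents first.
path = os.path.join(tempfile.mkdtemp(), 'demo.txt')
with open(path, 'w') as f:
    f.write('first line\nsecond line\n')

with open(path, 'r') as f:
    print(f.tell())      # 0 before anything is read
    print(f.readline())  # 'first line\n'
    print(f.seek(0))     # seek() returns the new absolute offset, here 0
    print(f.readline())  # 'first line\n' again after rewinding
```

Note that in text mode `tell()` after a read returns an opaque position (only guaranteed usable with `seek()`), which is why this sketch only rewinds to offset 0.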
ad60d43abe23350b1857b76ddabc0ead5bb34ac0 | 33,299 | py | Python | sdk/python/pulumi_azure/redis/enterprise_database.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 109 | 2018-06-18T00:19:44.000Z | 2022-02-20T05:32:57.000Z | sdk/python/pulumi_azure/redis/enterprise_database.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 663 | 2018-06-18T21:08:46.000Z | 2022-03-31T20:10:11.000Z | sdk/python/pulumi_azure/redis/enterprise_database.py | henriktao/pulumi-azure | f1cbcf100b42b916da36d8fe28be3a159abaf022 | [
"ECL-2.0",
"Apache-2.0"
] | 41 | 2018-07-19T22:37:38.000Z | 2022-03-14T10:56:26.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['EnterpriseDatabaseArgs', 'EnterpriseDatabase']
@pulumi.input_type
class EnterpriseDatabaseArgs:
    def __init__(__self__, *,
                 cluster_id: pulumi.Input[str],
                 resource_group_name: pulumi.Input[str],
                 client_protocol: Optional[pulumi.Input[str]] = None,
                 clustering_policy: Optional[pulumi.Input[str]] = None,
                 eviction_policy: Optional[pulumi.Input[str]] = None,
                 modules: Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]] = None,
                 name: Optional[pulumi.Input[str]] = None,
                 port: Optional[pulumi.Input[int]] = None):
        """
        The set of arguments for constructing an EnterpriseDatabase resource.
        :param pulumi.Input[str] cluster_id: The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[str] resource_group_name: The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[str] client_protocol: Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[str] clustering_policy: Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[str] eviction_policy: Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]] modules: A `module` block as defined below.
        :param pulumi.Input[str] name: The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
        :param pulumi.Input[int] port: TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
        """
        pulumi.set(__self__, "cluster_id", cluster_id)
        pulumi.set(__self__, "resource_group_name", resource_group_name)
        if client_protocol is not None:
            pulumi.set(__self__, "client_protocol", client_protocol)
        if clustering_policy is not None:
            pulumi.set(__self__, "clustering_policy", clustering_policy)
        if eviction_policy is not None:
            pulumi.set(__self__, "eviction_policy", eviction_policy)
        if modules is not None:
            pulumi.set(__self__, "modules", modules)
        if name is not None:
            pulumi.set(__self__, "name", name)
        if port is not None:
            pulumi.set(__self__, "port", port)
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> pulumi.Input[str]:
"""
The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "cluster_id")
@cluster_id.setter
def cluster_id(self, value: pulumi.Input[str]):
pulumi.set(self, "cluster_id", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Input[str]:
"""
The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: pulumi.Input[str]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="clientProtocol")
def client_protocol(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "client_protocol")
@client_protocol.setter
def client_protocol(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_protocol", value)
@property
@pulumi.getter(name="clusteringPolicy")
def clustering_policy(self) -> Optional[pulumi.Input[str]]:
"""
Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "clustering_policy")
@clustering_policy.setter
def clustering_policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "clustering_policy", value)
@property
@pulumi.getter(name="evictionPolicy")
def eviction_policy(self) -> Optional[pulumi.Input[str]]:
"""
Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "eviction_policy")
@eviction_policy.setter
def eviction_policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "eviction_policy", value)
@property
@pulumi.getter
def modules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]]:
"""
A `module` block as defined below.
"""
return pulumi.get(self, "modules")
@modules.setter
def modules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]]):
pulumi.set(self, "modules", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@pulumi.input_type
class _EnterpriseDatabaseState:
def __init__(__self__, *,
client_protocol: Optional[pulumi.Input[str]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
clustering_policy: Optional[pulumi.Input[str]] = None,
eviction_policy: Optional[pulumi.Input[str]] = None,
modules: Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]] = None,
name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
primary_access_key: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
secondary_access_key: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering EnterpriseDatabase resources.
:param pulumi.Input[str] client_protocol: Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] cluster_id: The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] clustering_policy: Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] eviction_policy: Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]] modules: A `module` block as defined below.
:param pulumi.Input[str] name: The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[int] port: TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] primary_access_key: The Primary Access Key for the Redis Enterprise Database Instance.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] secondary_access_key: The Secondary Access Key for the Redis Enterprise Database Instance.
"""
if client_protocol is not None:
pulumi.set(__self__, "client_protocol", client_protocol)
if cluster_id is not None:
pulumi.set(__self__, "cluster_id", cluster_id)
if clustering_policy is not None:
pulumi.set(__self__, "clustering_policy", clustering_policy)
if eviction_policy is not None:
pulumi.set(__self__, "eviction_policy", eviction_policy)
if modules is not None:
pulumi.set(__self__, "modules", modules)
if name is not None:
pulumi.set(__self__, "name", name)
if port is not None:
pulumi.set(__self__, "port", port)
if primary_access_key is not None:
pulumi.set(__self__, "primary_access_key", primary_access_key)
if resource_group_name is not None:
pulumi.set(__self__, "resource_group_name", resource_group_name)
if secondary_access_key is not None:
pulumi.set(__self__, "secondary_access_key", secondary_access_key)
@property
@pulumi.getter(name="clientProtocol")
def client_protocol(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "client_protocol")
@client_protocol.setter
def client_protocol(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "client_protocol", value)
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> Optional[pulumi.Input[str]]:
"""
The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "cluster_id")
@cluster_id.setter
def cluster_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "cluster_id", value)
@property
@pulumi.getter(name="clusteringPolicy")
def clustering_policy(self) -> Optional[pulumi.Input[str]]:
"""
Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "clustering_policy")
@clustering_policy.setter
def clustering_policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "clustering_policy", value)
@property
@pulumi.getter(name="evictionPolicy")
def eviction_policy(self) -> Optional[pulumi.Input[str]]:
"""
Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "eviction_policy")
@eviction_policy.setter
def eviction_policy(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "eviction_policy", value)
@property
@pulumi.getter
def modules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]]:
"""
A `module` block as defined below.
"""
return pulumi.get(self, "modules")
@modules.setter
def modules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['EnterpriseDatabaseModuleArgs']]]]):
pulumi.set(self, "modules", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter
def port(self) -> Optional[pulumi.Input[int]]:
"""
TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "port")
@port.setter
def port(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "port", value)
@property
@pulumi.getter(name="primaryAccessKey")
def primary_access_key(self) -> Optional[pulumi.Input[str]]:
"""
The Primary Access Key for the Redis Enterprise Database Instance.
"""
return pulumi.get(self, "primary_access_key")
@primary_access_key.setter
def primary_access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "primary_access_key", value)
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "resource_group_name")
@resource_group_name.setter
def resource_group_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_name", value)
@property
@pulumi.getter(name="secondaryAccessKey")
def secondary_access_key(self) -> Optional[pulumi.Input[str]]:
"""
The Secondary Access Key for the Redis Enterprise Database Instance.
"""
return pulumi.get(self, "secondary_access_key")
@secondary_access_key.setter
def secondary_access_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "secondary_access_key", value)
class EnterpriseDatabase(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_protocol: Optional[pulumi.Input[str]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
clustering_policy: Optional[pulumi.Input[str]] = None,
eviction_policy: Optional[pulumi.Input[str]] = None,
modules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['EnterpriseDatabaseModuleArgs']]]]] = None,
name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Manages a Redis Enterprise Database.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_enterprise_cluster = azure.redis.EnterpriseCluster("exampleEnterpriseCluster",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
sku_name="Enterprise_E20-4")
example_enterprise_database = azure.redis.EnterpriseDatabase("exampleEnterpriseDatabase",
resource_group_name=example_resource_group.name,
cluster_id=example_enterprise_cluster.id)
```
## Import
Redis Enterprise Databases can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:redis/enterpriseDatabase:EnterpriseDatabase example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.Cache/redisEnterprise/cluster1/databases/database1
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] client_protocol: Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] cluster_id: The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] clustering_policy: Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] eviction_policy: Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['EnterpriseDatabaseModuleArgs']]]] modules: A `module` block as defined below.
:param pulumi.Input[str] name: The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[int] port: TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: EnterpriseDatabaseArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Manages a Redis Enterprise Database.
## Example Usage
```python
import pulumi
import pulumi_azure as azure
example_resource_group = azure.core.ResourceGroup("exampleResourceGroup", location="West Europe")
example_enterprise_cluster = azure.redis.EnterpriseCluster("exampleEnterpriseCluster",
resource_group_name=example_resource_group.name,
location=example_resource_group.location,
sku_name="Enterprise_E20-4")
example_enterprise_database = azure.redis.EnterpriseDatabase("exampleEnterpriseDatabase",
resource_group_name=example_resource_group.name,
cluster_id=example_enterprise_cluster.id)
```
## Import
Redis Enterprise Databases can be imported using the `resource id`, e.g.
```sh
$ pulumi import azure:redis/enterpriseDatabase:EnterpriseDatabase example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/group1/providers/Microsoft.Cache/redisEnterprise/cluster1/databases/database1
```
:param str resource_name: The name of the resource.
:param EnterpriseDatabaseArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(EnterpriseDatabaseArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
client_protocol: Optional[pulumi.Input[str]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
clustering_policy: Optional[pulumi.Input[str]] = None,
eviction_policy: Optional[pulumi.Input[str]] = None,
modules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['EnterpriseDatabaseModuleArgs']]]]] = None,
name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = EnterpriseDatabaseArgs.__new__(EnterpriseDatabaseArgs)
__props__.__dict__["client_protocol"] = client_protocol
if cluster_id is None and not opts.urn:
raise TypeError("Missing required property 'cluster_id'")
__props__.__dict__["cluster_id"] = cluster_id
__props__.__dict__["clustering_policy"] = clustering_policy
__props__.__dict__["eviction_policy"] = eviction_policy
__props__.__dict__["modules"] = modules
__props__.__dict__["name"] = name
__props__.__dict__["port"] = port
if resource_group_name is None and not opts.urn:
raise TypeError("Missing required property 'resource_group_name'")
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["primary_access_key"] = None
__props__.__dict__["secondary_access_key"] = None
super(EnterpriseDatabase, __self__).__init__(
'azure:redis/enterpriseDatabase:EnterpriseDatabase',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
client_protocol: Optional[pulumi.Input[str]] = None,
cluster_id: Optional[pulumi.Input[str]] = None,
clustering_policy: Optional[pulumi.Input[str]] = None,
eviction_policy: Optional[pulumi.Input[str]] = None,
modules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['EnterpriseDatabaseModuleArgs']]]]] = None,
name: Optional[pulumi.Input[str]] = None,
port: Optional[pulumi.Input[int]] = None,
primary_access_key: Optional[pulumi.Input[str]] = None,
resource_group_name: Optional[pulumi.Input[str]] = None,
secondary_access_key: Optional[pulumi.Input[str]] = None) -> 'EnterpriseDatabase':
"""
Get an existing EnterpriseDatabase resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] client_protocol: Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] cluster_id: The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] clustering_policy: Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] eviction_policy: Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['EnterpriseDatabaseModuleArgs']]]] modules: A `module` block as defined below.
:param pulumi.Input[str] name: The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[int] port: TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] primary_access_key: The Primary Access Key for the Redis Enterprise Database Instance.
:param pulumi.Input[str] resource_group_name: The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
:param pulumi.Input[str] secondary_access_key: The Secondary Access Key for the Redis Enterprise Database Instance.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _EnterpriseDatabaseState.__new__(_EnterpriseDatabaseState)
__props__.__dict__["client_protocol"] = client_protocol
__props__.__dict__["cluster_id"] = cluster_id
__props__.__dict__["clustering_policy"] = clustering_policy
__props__.__dict__["eviction_policy"] = eviction_policy
__props__.__dict__["modules"] = modules
__props__.__dict__["name"] = name
__props__.__dict__["port"] = port
__props__.__dict__["primary_access_key"] = primary_access_key
__props__.__dict__["resource_group_name"] = resource_group_name
__props__.__dict__["secondary_access_key"] = secondary_access_key
return EnterpriseDatabase(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="clientProtocol")
def client_protocol(self) -> pulumi.Output[Optional[str]]:
"""
Specifies whether redis clients can connect using TLS-encrypted or plaintext redis protocols. Default is TLS-encrypted. Possible values are `Encrypted` and `Plaintext`. Defaults to `Encrypted`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "client_protocol")
@property
@pulumi.getter(name="clusterId")
def cluster_id(self) -> pulumi.Output[str]:
"""
The resource id of the Redis Enterprise Cluster to deploy this Redis Enterprise Database. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "cluster_id")
@property
@pulumi.getter(name="clusteringPolicy")
def clustering_policy(self) -> pulumi.Output[Optional[str]]:
"""
Clustering policy - default is OSSCluster. Specified at create time. Possible values are `EnterpriseCluster` and `OSSCluster`. Defaults to `OSSCluster`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "clustering_policy")
@property
@pulumi.getter(name="evictionPolicy")
def eviction_policy(self) -> pulumi.Output[Optional[str]]:
"""
Redis eviction policy - default is VolatileLRU. Possible values are `AllKeysLFU`, `AllKeysLRU`, `AllKeysRandom`, `VolatileLRU`, `VolatileLFU`, `VolatileTTL`, `VolatileRandom` and `NoEviction`. Defaults to `VolatileLRU`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "eviction_policy")
@property
@pulumi.getter
def modules(self) -> pulumi.Output[Optional[Sequence['outputs.EnterpriseDatabaseModule']]]:
"""
A `module` block as defined below.
"""
return pulumi.get(self, "modules")
@property
@pulumi.getter
def name(self) -> pulumi.Output[str]:
"""
The name which should be used for this Redis Enterprise Database. Currently the acceptable value for this argument is `default`. Defaults to `default`. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "name")
@property
@pulumi.getter
def port(self) -> pulumi.Output[Optional[int]]:
"""
TCP port of the database endpoint. Specified at create time. Defaults to an available port. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "port")
@property
@pulumi.getter(name="primaryAccessKey")
def primary_access_key(self) -> pulumi.Output[str]:
"""
The Primary Access Key for the Redis Enterprise Database Instance.
"""
return pulumi.get(self, "primary_access_key")
@property
@pulumi.getter(name="resourceGroupName")
def resource_group_name(self) -> pulumi.Output[str]:
"""
The name of the Resource Group where the Redis Enterprise Database should exist. Changing this forces a new Redis Enterprise Database to be created.
"""
return pulumi.get(self, "resource_group_name")
@property
@pulumi.getter(name="secondaryAccessKey")
def secondary_access_key(self) -> pulumi.Output[str]:
"""
The Secondary Access Key for the Redis Enterprise Database Instance.
"""
return pulumi.get(self, "secondary_access_key")
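The plain `__init__` above dispatches between the two overloads by inspecting its positional arguments before calling `_internal_init`. A minimal stdlib-only sketch of that dispatch pattern (the class names and `get_resource_args_opts` here are simplified stand-ins, not the actual `pulumi` / `_utilities` implementations):

```python
class ResourceArgs:
    """Stand-in for EnterpriseDatabaseArgs in this sketch."""
    def __init__(self, cluster_id: str, resource_group_name: str):
        self.cluster_id = cluster_id
        self.resource_group_name = resource_group_name

class ResourceOptions:
    """Stand-in for pulumi.ResourceOptions in this sketch."""

def get_resource_args_opts(args_type, opts_type, *args, **kwargs):
    # If the caller used the second overload and passed an args object
    # positionally, pick it out; an opts instance is honored either way.
    # Otherwise the keyword-argument overload is assumed and resource_args
    # stays None, so the caller falls through to _internal_init(*args, **kwargs).
    resource_args = None
    opts = kwargs.get("opts")
    for a in args:
        if isinstance(a, args_type):
            resource_args = a
        elif isinstance(a, opts_type):
            opts = a
    return resource_args, opts
```

With this shape, the `resource_args is not None` branch can re-expand the args object back into keyword arguments via `**resource_args.__dict__`, which is exactly what the generated `__init__` does.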
| 56.921368 | 337 | 0.695066 | 3,926 | 33,299 | 5.722364 | 0.059093 | 0.065121 | 0.057331 | 0.054838 | 0.91663 | 0.902742 | 0.888409 | 0.874165 | 0.864862 | 0.855337 | 0 | 0.002943 | 0.214181 | 33,299 | 584 | 338 | 57.018836 | 0.85562 | 0.46608 | 0 | 0.737003 | 1 | 0 | 0.119624 | 0.022972 | 0 | 0 | 0 | 0 | 0 | 1 | 0.16208 | false | 0.003058 | 0.021407 | 0 | 0.281346 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ad779be1f5e86d684093783630faab91c1c8003c | 506 | py | Python | test/test_pycron.py | mutalisk999/pycron | 8aba4c58a77179cdce6045913661e802592fb8e5 | [
"MIT"
] | null | null | null | test/test_pycron.py | mutalisk999/pycron | 8aba4c58a77179cdce6045913661e802592fb8e5 | [
"MIT"
] | null | null | null | test/test_pycron.py | mutalisk999/pycron | 8aba4c58a77179cdce6045913661e802592fb8e5 | [
"MIT"
] | null | null | null | import time, pycron

print(pycron.last_cronexpr_time("0 * * * * ?", int(time.time())))
print(pycron.next_cronexpr_time("0 * * * * ?", int(time.time())))
print(pycron.last_cronexpr_time("0/5 * * * * ?", int(time.time())))
print(pycron.next_cronexpr_time("0/5 * * * * ?", int(time.time())))
print(pycron.last_cronexpr_time("1,2,3 * * * * ?", int(time.time())))
print(pycron.next_cronexpr_time("1,2,3 * * * * ?", int(time.time())))
'''
sample output (timestamps from one particular run):
1478964180
1478964240
1478964235
1478964240
1478964183
1478964241
'''
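The sample numbers above are internally consistent with a single run at some timestamp between 1478964235 and 1478964239. For the first pair of calls, `"0 * * * * ?"` fires at second 0 of every minute, so the computation reduces to minute-boundary rounding; a stdlib-only sketch (these helper names are illustrative, not part of pycron):

```python
def last_minute_boundary(ts: int) -> int:
    # Most recent fire time of "0 * * * * ?" at or before ts:
    # round down to a multiple of 60 seconds.
    return (ts // 60) * 60

def next_minute_boundary(ts: int) -> int:
    # Next fire time strictly after ts: the following multiple of 60.
    return (ts // 60 + 1) * 60
```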
| 24.095238 | 68 | 0.65415 | 70 | 506 | 4.557143 | 0.271429 | 0.206897 | 0.206897 | 0.250784 | 0.752351 | 0.752351 | 0.705329 | 0.705329 | 0.360502 | 0 | 0 | 0.163265 | 0.128459 | 506 | 20 | 69 | 25.3 | 0.560091 | 0 | 0 | 0 | 0 | 0 | 0.180139 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.142857 | null | null | 0.857143 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 8 |
ad7e4dad4217d19180f5806fa1b40042ba76ea2c | 6,734 | py | Python | test/modules/shogi_input_test.py | nakanishi-m/slack-shogi | c1046cc8fb2d0f5164446b4370a55e2f11eb932c | [
"MIT"
] | 41 | 2016-07-18T14:50:10.000Z | 2021-12-16T03:20:15.000Z | test/modules/shogi_input_test.py | nakanishi-m/slack-shogi | c1046cc8fb2d0f5164446b4370a55e2f11eb932c | [
"MIT"
] | 96 | 2016-07-16T16:11:23.000Z | 2021-07-06T05:27:10.000Z | test/modules/shogi_input_test.py | setokinto/slack-shogi | a026fdb5e0acb5573230e26ec08e6cf9a8759f5e | [
"MIT"
] | 6 | 2017-01-04T06:33:50.000Z | 2018-10-25T02:19:09.000Z |
import unittest
from app.modules.shogi_input import ShogiInput, UserDifferentException, KomaCannotMoveException
from app.modules.shogi import Koma
class ShogiTest(unittest.TestCase):
def setUp(self):
pass
def test_shogi_input_is_initable(self):
shogi = ShogiInput.init("channel_id", [{
"id": "user1",
"name": "user1name",
}, {
"id": "user2",
"name": "user2name",
}
])
self.assertEqual(shogi.channel_id, "channel_id")
shogi = ShogiInput.init("channel_id", [{
"id": "user1",
"name": "user1name",
}, {
"id": "user2",
"name": "user2name",
}
])
self.assertIsNone(shogi)
ShogiInput.clear("channel_id")
shogi = ShogiInput.init("channel_id", [{
"id": "user1",
"name": "user1name",
}, {
"id": "user2",
"name": "user2name",
}
])
self.assertEqual(shogi.channel_id, "channel_id")
    def test_clear_for_non_exists_channel(self):
self.assertIsNone(ShogiInput.clear("channel_id_non_exists"))
def test_move_method_should_work(self):
channel_id = "test_move_method_should_work"
shogi = ShogiInput.init(channel_id, [{
"id": "user1",
"name": "user1name",
}, {
"id": "user2",
"name": "user2name",
}])
ShogiInput.move("76歩", channel_id, shogi.first_user.id)
self.assertEqual(shogi.board[5][2], Koma.fu)
def test_move_method_should_raise_UserDifferentException(self):
channel_id = "test_move_method_should_raise_UserDifferentException"
shogi = ShogiInput.init(channel_id, [{
"id": "user1",
"name": "user1name",
}, {
"id": "user2",
"name": "user2name",
}])
with self.assertRaises(UserDifferentException):
ShogiInput.move("76歩", channel_id, shogi.second_user.id)
        # a second identical attempt by the wrong player must raise again
        with self.assertRaises(UserDifferentException):
            ShogiInput.move("76歩", channel_id, shogi.second_user.id)
    def test_move_method_should_raise_KomaCannotMoveException(self):
        channel_id = "test_move_method_should_raise_KomaCannotMoveException"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        with self.assertRaises(KomaCannotMoveException):
            ShogiInput.move("75歩", channel_id, shogi.first_user.id)
        with self.assertRaises(KomaCannotMoveException):
            ShogiInput.move("34歩", channel_id, shogi.first_user.id)
        with self.assertRaises(KomaCannotMoveException):
            ShogiInput.move("15151歩", channel_id, shogi.first_user.id)
        with self.assertRaises(KomaCannotMoveException):
            ShogiInput.move("Wow, it's great.", channel_id, shogi.first_user.id)
    def test_set_any_user_validator(self):
        channel_id = "test_set_validator"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        ShogiInput.move("76歩", channel_id, shogi.first_user.id)
        with self.assertRaises(UserDifferentException):
            ShogiInput.move("34歩", channel_id, shogi.first_user.id)
        # setAllMode lets any user move regardless of whose turn it is.
        ShogiInput.setAllMode(channel_id)
        ShogiInput.move("34歩", channel_id, shogi.first_user.id)
    def test_matta(self):
        channel_id = "test_matta"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        ShogiInput.move("76歩", channel_id, shogi.first_user.id)
        self.assertEqual(shogi.board[5][2], Koma.fu)
        # matta (undo) restores the square, and the move can then be replayed.
        ShogiInput.matta(channel_id, shogi.second_user.id)
        self.assertEqual(shogi.board[5][2], Koma.empty)
        ShogiInput.move("76歩", channel_id, shogi.first_user.id)
        self.assertEqual(shogi.board[5][2], Koma.fu)
    def test_matta_for_UserDifferentException(self):
        channel_id = "test_matta_for_UserDifferentException"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        ShogiInput.move("76歩", channel_id, shogi.first_user.id)
        self.assertEqual(shogi.board[5][2], Koma.fu)
        with self.assertRaises(UserDifferentException):
            ShogiInput.matta(channel_id, shogi.first_user.id)
        ShogiInput.move("34歩", channel_id, shogi.second_user.id)
        with self.assertRaises(UserDifferentException):
            ShogiInput.matta(channel_id, shogi.second_user.id)
    def test_matta_for_KomaCannotMoveException(self):
        channel_id = "test_matta_for_KomaCannotMoveException"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        # With no move to undo, matta raises KomaCannotMoveException.
        with self.assertRaises(KomaCannotMoveException):
            ShogiInput.matta(channel_id, shogi.first_user.id)
    def test_matta_for_drop_komas(self):
        channel_id = "test_matta_for_da_komas"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        ShogiInput.move("76歩", channel_id, shogi.first_user.id)
        ShogiInput.move("34歩", channel_id, shogi.second_user.id)
        ShogiInput.move("22角", channel_id, shogi.first_user.id)
        ShogiInput.move("同銀", channel_id, shogi.second_user.id)
        ShogiInput.move("55角打", channel_id, shogi.first_user.id)
        ShogiInput.move("33角打", channel_id, shogi.second_user.id)
        # Undo both drops, then replay them; the board should reflect the replays.
        ShogiInput.matta(channel_id, shogi.first_user.id)
        ShogiInput.matta(channel_id, shogi.second_user.id)
        ShogiInput.move("55角打", channel_id, shogi.first_user.id)
        ShogiInput.move("33角打", channel_id, shogi.second_user.id)
        self.assertEqual(shogi.board[4][4], Koma.kaku)
        self.assertEqual(shogi.board[2][6], Koma.opponent_kaku)
    def test_try_to_get_shogi_board(self):
        channel_id = "test_try_to_get_shogi_board"
        shogi = ShogiInput.init(channel_id, [{
            "id": "user1",
            "name": "user1name",
        }, {
            "id": "user2",
            "name": "user2name",
        }])
        ShogiInput.get_shogi_board(channel_id)
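The move strings in these tests use shogi notation: the first digit is the file (column, counted from the board's right edge) and the second is the rank (row), which is why "76歩" is asserted at `board[5][2]`. A minimal sketch of that coordinate conversion, which the assertions implicitly rely on (the helper name is hypothetical, not part of `app.modules.shogi_input`):

```python
def notation_to_indices(move: str) -> tuple:
    """Convert a shogi move like '76歩' to (row, col) indices on a 9x9 board.

    File 7 (counted from the right) becomes column 9 - 7 = 2, and rank 6
    becomes row 6 - 1 = 5 -- matching board[5][2] in the tests above.
    """
    file_digit = int(move[0])
    rank_digit = int(move[1])
    return rank_digit - 1, 9 - file_digit
```

The same mapping explains the drop assertions: "55角打" lands at `board[4][4]` and "33角打" at `board[2][6]`.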
"""VectorNotComputedException

Exception raised when the vector of a word could not be computed (typically
because the word does not appear in the model vocabulary)."""


class VectorNotComputedException(Exception):
    """Exception raised when the vector of a word could not be computed
    (typically because the word does not appear in the model vocabulary)."""
    pass
import json

import pytest
import requests

import demistomock as demisto

bundle_index = 0
submitted_indicators = 0

mocked_get_token_response = """{"access_token": "fababfafbh"}"""
{
"id": "bundle--f00374ec-429c-40cb-b7bb-61f920814775",
"objects": [
{
"created": "2017-01-20T00:00:00.000Z",
"definition": {"tlp": "amber"},
"definition_type": "tlp",
"id": "marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
"type": "marking-definition",
},
{
"created": "2019-12-26T00:00:00Z",
"definition": {"statement": "Copyright Sixgill 2020. All rights reserved."},
"definition_type": "statement",
"id": "marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"type": "marking-definition",
},
{
"created": "2020-09-06T20:33:33.538Z",
"external_references": [{"external_id": "CVE-2020-15392", "source_name": "cve"}],
"id": "cveevent--a26f4710-0d64-4a76-ae27-6ac038e7536b",
"modified": "2020-09-06T20:33:33.538Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f1f17164731b1cef86c8aaf",
"action": "trend",
"description": "Trend of Github commits related to CVE-2020-15392",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Github_commits",
"prev_level": "prev_level",
"type": "github_authoring",
},
"nvd": {
"base_score_v3": 5.3,
"base_severity_v3": "MEDIUM",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-15392",
"modified": "2020-07-15T16:52Z",
"published": "2020-07-07T14:15Z",
"score_2_0": 5.0,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:L/Au:N/C:P/I:N/A:N",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N",
},
"score": {
"current": 0.02,
"highest": {"date": "2020-07-27T00:00Z", "value": 0.02},
"previouslyExploited": 0.07,
},
},
},
{
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-2021", "source_name": "cve"}],
"id": "cveevent--9c735811-6e08-44d8-a844-75acb10d79b9",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b6245",
"action": "trend",
"description": "CVE-2020-2021 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": 10.0,
"base_severity_v3": "CRITICAL",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-2021",
"modified": "2020-07-06T14:39Z",
"published": "2020-06-29T15:15Z",
"score_2_0": 9.3,
"severity_2_0": "HIGH",
"vector_v2": "AV:N/AC:M/Au:N/C:C/I:C/A:C",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H",
},
"score": {
"current": 9.13,
"highest": {"date": "2020-07-14T00:00Z", "value": 9.25},
"previouslyExploited": 5.32,
},
},
},
{
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-12828", "source_name": "cve"}],
"id": "cveevent--dffdcd6b-2157-4652-b7eb-4ce4bb9eebc5",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b6274",
"action": "trend",
"description": "CVE-2020-12828 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": 9.8,
"base_severity_v3": "CRITICAL",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-12828",
"modified": "2020-06-02T16:55Z",
"published": "2020-05-21T17:15Z",
"score_2_0": 10.0,
"severity_2_0": "HIGH",
"vector_v2": "AV:N/AC:L/Au:N/C:C/I:C/A:C",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
},
"score": {
"current": 8.33,
"highest": {"date": "2020-07-25T00:00Z", "value": 8.4},
"previouslyExploited": 5.07,
},
},
},
{
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-9771", "source_name": "cve"}],
"id": "cveevent--4b86077c-99f6-42ca-8b4d-953411fa17bd",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b627c",
"action": "trend",
"description": "CVE-2020-9771 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-9771",
"modified": None,
"published": None,
"score_2_0": None,
"severity_2_0": None,
"vector_v2": "None",
"vector_v3": "None",
},
"score": {"current": None, "highest": {"date": None, "value": None}, "previouslyExploited": None},
},
},
{
"created": "2020-08-25T17:16:52.536Z",
"external_references": [{"external_id": "CVE-2015-6086", "source_name": "cve"}],
"id": "cveevent--1d6320f1-8b22-48e2-876d-5e31b9d36288",
"modified": "2020-08-25T17:16:52.536Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f454784ffebcfa91197c9d0",
"action": "modified",
"description": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"event_datetime": "2020-06-30T00:00Z",
"level": "None",
"name": "Sixgill_score_level_change",
"prev_level": "prev_level",
"type": "score_level",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"modified": "2018-10-12T22:10Z",
"published": "2015-11-11T12:59Z",
"score_2_0": 4.3,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"vector_v3": "None",
},
"score": {
"current": None,
"highest": {"date": "2016-04-14T00:00Z", "value": 7.02},
"previouslyExploited": 1.51,
},
},
},
{
"created": "2020-08-25T17:16:52.536Z",
"external_references": [{"external_id": "CVE-2015-6086", "source_name": "cve"}],
"id": "cveevent--1d6320f1-8b22-48e2-876d-5e31b9d36288",
"modified": "2020-08-25T17:16:52.536Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"type": "x-cybersixgill-com-cve-event",
"x_sixgill_info": {
"event": {
"_id": "5f454784ffebcfa91197c9d0",
"action": "modified",
"description": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"event_datetime": "2020-06-30T00:00Z",
"level": "None",
"name": "Sixgill_score_level_change",
"prev_level": "prev_level",
"type": "score_level",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"modified": "2018-10-12T22:10Z",
"published": "2015-11-11T12:59Z",
"score_2_0": 4.3,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"vector_v3": "None",
},
"score": {
"current": None,
"highest": {"date": "2016-04-14T00:00Z", "value": 7.02},
"previouslyExploited": 1.51,
},
},
},
],
"spec_version": "2.0",
"type": "bundle",
},
{
"id": "bundle--f00374ec-429c-40cb-b7bb-61f920814775",
"objects": [
{
"created": "2017-01-20T00:00:00.000Z",
"definition": {"tlp": "amber"},
"definition_type": "tlp",
"id": "marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
"type": "marking-definition",
},
{
"created": "2019-12-26T00:00:00Z",
"definition": {"statement": "Copyright Sixgill 2020. All rights reserved."},
"definition_type": "statement",
"id": "marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"type": "marking-definition",
},
],
"spec_version": "2.0",
"type": "bundle",
},
]
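`iocs_bundle` imitates a STIX 2.0 bundle: two `marking-definition` objects followed by `x-cybersixgill-com-cve-event` objects whose `external_references` carry the CVE ids. A minimal sketch of pulling those ids out of such a bundle, assuming marking definitions are skipped (the helper name is illustrative, not part of the integration under test):

```python
def extract_cve_ids(bundle: dict) -> list:
    """Collect CVE ids from a STIX-style bundle, skipping marking definitions."""
    ids = []
    for obj in bundle.get("objects", []):
        if obj.get("type") == "marking-definition":
            continue
        for ref in obj.get("external_references", []):
            if ref.get("source_name") == "cve":
                ids.append(ref.get("external_id"))
    return ids
```

Applied to the first bundle above, this yields the five CVE ids that the expected indicator output is built from (one of them twice, since the last event is duplicated on purpose).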
expected_ioc_output = [
{
"value": "CVE-2020-15392",
"type": "CVE",
"rawJSON": {
"value": "CVE-2020-15392",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-09-06T20:33:33.538Z",
"external_references": [{"external_id": "CVE-2020-15392", "source_name": "cve"}],
"id": "cveevent--a26f4710-0d64-4a76-ae27-6ac038e7536b",
"modified": "2020-09-06T20:33:33.538Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f1f17164731b1cef86c8aaf",
"action": "trend",
"description": "Trend of Github commits related to CVE-2020-15392",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Github_commits",
"prev_level": "prev_level",
"type": "github_authoring",
},
"nvd": {
"base_score_v3": 5.3,
"base_severity_v3": "MEDIUM",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-15392",
"modified": "2020-07-15T16:52Z",
"published": "2020-07-07T14:15Z",
"score_2_0": 5.0,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:L/Au:N/C:P/I:N/A:N",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N",
},
"score": {
"current": 0.02,
"highest": {"date": "2020-07-27T00:00Z", "value": 0.02},
"previouslyExploited": 0.07,
},
},
},
"score": "3",
"fields": {
"description": """Description: Trend of Github commits related to CVE-2020-15392
Created: 2020-09-06T20:33:33.538Z
Modified: 2020-09-06T20:33:33.538Z
External id: CVE-2020-15392
Sixgill DVE score - current: 0.02
Sixgill DVE score - highest ever date: 2020-07-27T00:00Z
Sixgill DVE score - highest ever: 0.02
Sixgill - Previously exploited probability: 0.07
Event Name: trend_Github_commits
Event Type: github_authoring
Event Action: trend
Previous level: prev_level
Event Description: Trend of Github commits related to CVE-2020-15392
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: 5.3
CVSS 3.1 severity: MEDIUM
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2020-15392
NVD - last modified date: 2020-07-15T16:52Z
NVD - publication date: 2020-07-07T14:15Z
CVSS 2.0 score: 5.0
CVSS 2.0 severity: MEDIUM
NVD Vector - V2.0: AV:N/AC:L/Au:N/C:P/I:N/A:N
NVD Vector - V3.1: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N
""",
"creationdate": "2020-09-06T20:33:33.538Z",
"modified": "2020-09-06T20:33:33.538Z",
"externalid": "CVE-2020-15392",
"sixgilldvescorecurrent": 0.02,
"sixgilldvescorehighesteverdate": "2020-07-27T00:00Z",
"sixgilldvescorehighestever": 0.02,
"sixgillpreviouslyexploitedprobability": 0.07,
"eventname": "trend_Github_commits",
"eventtype": "github_authoring",
"eventaction": "trend",
"previouslevel": "prev_level",
"eventdescription": "Trend of Github commits related to CVE-2020-15392",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": 5.3,
"cvss31severity": "MEDIUM",
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2020-15392",
"nvdlastmodifieddate": "2020-07-15T16:52Z",
"nvdpublicationdate": "2020-07-07T14:15Z",
"cvss20score": 5.0,
"cvss20severity": "MEDIUM",
"nvdvectorv20": "AV:N/AC:L/Au:N/C:P/I:N/A:N",
"nvdvectorv31": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N",
},
},
{
"value": "CVE-2020-2021",
"type": "CVE",
"rawJSON": {
"value": "CVE-2020-2021",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-2021", "source_name": "cve"}],
"id": "cveevent--9c735811-6e08-44d8-a844-75acb10d79b9",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b6245",
"action": "trend",
"description": "CVE-2020-2021 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": 10,
"base_severity_v3": "CRITICAL",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-2021",
"modified": "2020-07-06T14:39Z",
"published": "2020-06-29T15:15Z",
"score_2_0": 9.3,
"severity_2_0": "HIGH",
"vector_v2": "AV:N/AC:M/Au:N/C:C/I:C/A:C",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H",
},
"score": {
"current": 9.13,
"highest": {"date": "2020-07-14T00:00Z", "value": 9.25},
"previouslyExploited": 5.32,
},
},
},
"score": "3",
"fields": {
"description": """Description: CVE-2020-2021 is trending on Twitter.
Created: 2020-08-19T23:08:05.709Z
Modified: 2020-08-19T23:08:05.709Z
External id: CVE-2020-2021
Sixgill DVE score - current: 9.13
Sixgill DVE score - highest ever date: 2020-07-14T00:00Z
Sixgill DVE score - highest ever: 9.25
Sixgill - Previously exploited probability: 5.32
Event Name: trend_Twitter
Event Type: dark_mention
Event Action: trend
Previous level: prev_level
Event Description: CVE-2020-2021 is trending on Twitter.
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: 10.0
CVSS 3.1 severity: CRITICAL
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2020-2021
NVD - last modified date: 2020-07-06T14:39Z
NVD - publication date: 2020-06-29T15:15Z
CVSS 2.0 score: 9.3
CVSS 2.0 severity: HIGH
NVD Vector - V2.0: AV:N/AC:M/Au:N/C:C/I:C/A:C
NVD Vector - V3.1: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H
""",
"creationdate": "2020-08-19T23:08:05.709Z",
"modified": "2020-08-19T23:08:05.709Z",
"externalid": "CVE-2020-2021",
"sixgilldvescorecurrent": 9.13,
"sixgilldvescorehighesteverdate": "2020-07-14T00:00Z",
"sixgilldvescorehighestever": 9.25,
"sixgillpreviouslyexploitedprobability": 5.32,
"eventname": "trend_Twitter",
"eventtype": "dark_mention",
"eventaction": "trend",
"previouslevel": "prev_level",
"eventdescription": "CVE-2020-2021 is trending on Twitter.",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": 10.0,
"cvss31severity": "CRITICAL",
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2020-2021",
"nvdlastmodifieddate": "2020-07-06T14:39Z",
"nvdpublicationdate": "2020-06-29T15:15Z",
"cvss20score": 9.3,
"cvss20severity": "HIGH",
"nvdvectorv20": "AV:N/AC:M/Au:N/C:C/I:C/A:C",
"nvdvectorv31": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H",
},
},
{
"value": "CVE-2020-12828",
"type": "CVE",
"rawJSON": {
"value": "CVE-2020-12828",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-12828", "source_name": "cve"}],
"id": "cveevent--dffdcd6b-2157-4652-b7eb-4ce4bb9eebc5",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b6274",
"action": "trend",
"description": "CVE-2020-12828 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": 9.8,
"base_severity_v3": "CRITICAL",
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-12828",
"modified": "2020-06-02T16:55Z",
"published": "2020-05-21T17:15Z",
"score_2_0": 10.0,
"severity_2_0": "HIGH",
"vector_v2": "AV:N/AC:L/Au:N/C:C/I:C/A:C",
"vector_v3": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
},
"score": {
"current": 8.33,
"highest": {"date": "2020-07-25T00:00Z", "value": 8.4},
"previouslyExploited": 5.07,
},
},
},
"score": "3",
"fields": {
"description": """Description: CVE-2020-12828 is trending on Twitter.
Created: 2020-08-19T23:08:05.709Z
Modified: 2020-08-19T23:08:05.709Z
External id: CVE-2020-12828
Sixgill DVE score - current: 8.33
Sixgill DVE score - highest ever date: 2020-07-25T00:00Z
Sixgill DVE score - highest ever: 8.4
Sixgill - Previously exploited probability: 5.07
Event Name: trend_Twitter
Event Type: dark_mention
Event Action: trend
Previous level: prev_level
Event Description: CVE-2020-12828 is trending on Twitter.
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: 9.8
CVSS 3.1 severity: CRITICAL
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2020-12828
NVD - last modified date: 2020-06-02T16:55Z
NVD - publication date: 2020-05-21T17:15Z
CVSS 2.0 score: 10.0
CVSS 2.0 severity: HIGH
NVD Vector - V2.0: AV:N/AC:L/Au:N/C:C/I:C/A:C
NVD Vector - V3.1: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
""",
"creationdate": "2020-08-19T23:08:05.709Z",
"modified": "2020-08-19T23:08:05.709Z",
"externalid": "CVE-2020-12828",
"sixgilldvescorecurrent": 8.33,
"sixgilldvescorehighesteverdate": "2020-07-25T00:00Z",
"sixgilldvescorehighestever": 8.4,
"sixgillpreviouslyexploitedprobability": 5.07,
"eventname": "trend_Twitter",
"eventtype": "dark_mention",
"eventaction": "trend",
"previouslevel": "prev_level",
"eventdescription": "CVE-2020-12828 is trending on Twitter.",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": 9.8,
"cvss31severity": "CRITICAL",
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2020-12828",
"nvdlastmodifieddate": "2020-06-02T16:55Z",
"nvdpublicationdate": "2020-05-21T17:15Z",
"cvss20score": 10.0,
"cvss20severity": "HIGH",
"nvdvectorv20": "AV:N/AC:L/Au:N/C:C/I:C/A:C",
"nvdvectorv31": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
},
},
{
"value": "CVE-2020-9771",
"type": "CVE",
"rawJSON": {
"value": "CVE-2020-9771",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-08-19T23:08:05.709Z",
"external_references": [{"external_id": "CVE-2020-9771", "source_name": "cve"}],
"id": "cveevent--4b86077c-99f6-42ca-8b4d-953411fa17bd",
"modified": "2020-08-19T23:08:05.709Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f3db0ec3ecfe5a6d70b627c",
"action": "trend",
"description": "CVE-2020-9771 is trending on Twitter.",
"event_datetime": "2020-06-30T00:00Z",
"name": "trend_Twitter",
"prev_level": "prev_level",
"type": "dark_mention",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2020-9771",
"modified": None,
"published": None,
"score_2_0": None,
"severity_2_0": None,
"vector_v2": "None",
"vector_v3": "None",
},
"score": {"current": None, "highest": {"date": None, "value": None}, "previouslyExploited": None},
},
},
"score": "3",
"fields": {
"description": """Description: CVE-2020-9771 is trending on Twitter.
Created: 2020-08-19T23:08:05.709Z
Modified: 2020-08-19T23:08:05.709Z
External id: CVE-2020-9771
Sixgill DVE score - current: None
Sixgill DVE score - highest ever date: None
Sixgill DVE score - highest ever: None
Sixgill - Previously exploited probability: None
Event Name: trend_Twitter
Event Type: dark_mention
Event Action: trend
Previous level: prev_level
Event Description: CVE-2020-9771 is trending on Twitter.
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: None
CVSS 3.1 severity: None
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2020-9771
NVD - last modified date: None
NVD - publication date: None
CVSS 2.0 score: None
CVSS 2.0 severity: None
NVD Vector - V2.0: None
NVD Vector - V3.1: None
""",
"creationdate": "2020-08-19T23:08:05.709Z",
"modified": "2020-08-19T23:08:05.709Z",
"externalid": "CVE-2020-9771",
"sixgilldvescorecurrent": None,
"sixgilldvescorehighesteverdate": None,
"sixgilldvescorehighestever": None,
"sixgillpreviouslyexploitedprobability": None,
"eventname": "trend_Twitter",
"eventtype": "dark_mention",
"eventaction": "trend",
"previouslevel": "prev_level",
"eventdescription": "CVE-2020-9771 is trending on Twitter.",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": None,
"cvss31severity": None,
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2020-9771",
"nvdlastmodifieddate": None,
"nvdpublicationdate": None,
"cvss20score": None,
"cvss20severity": None,
"nvdvectorv20": "None",
"nvdvectorv31": "None",
},
},
{
"value": "CVE-2015-6086",
"type": "CVE",
"rawJSON": {
"value": "CVE-2015-6086",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-08-25T17:16:52.536Z",
"external_references": [{"external_id": "CVE-2015-6086", "source_name": "cve"}],
"id": "cveevent--1d6320f1-8b22-48e2-876d-5e31b9d36288",
"modified": "2020-08-25T17:16:52.536Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f454784ffebcfa91197c9d0",
"action": "modified",
"description": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"event_datetime": "2020-06-30T00:00Z",
"level": "None",
"name": "Sixgill_score_level_change",
"prev_level": "prev_level",
"type": "score_level",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"modified": "2018-10-12T22:10Z",
"published": "2015-11-11T12:59Z",
"score_2_0": 4.3,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"vector_v3": "None",
},
"score": {
"current": None,
"highest": {"date": "2016-04-14T00:00Z", "value": 7.02},
"previouslyExploited": 1.51,
},
},
},
"score": "3",
"fields": {
"description": """Description: Sixgill Current score of CVE-2015-6086 changed from Low to None.
Created: 2020-08-25T17:16:52.536Z
Modified: 2020-08-25T17:16:52.536Z
External id: CVE-2015-6086
Sixgill DVE score - current: None
Sixgill DVE score - highest ever date: 2016-04-14T00:00Z
Sixgill DVE score - highest ever: 7.02
Sixgill - Previously exploited probability: 1.51
Event Name: Sixgill_score_level_change
Event Type: score_level
Event Action: modified
Previous level: prev_level
Event Description: Sixgill Current score of CVE-2015-6086 changed from Low to None.
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: None
CVSS 3.1 severity: None
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2015-6086
NVD - last modified date: 2018-10-12T22:10Z
NVD - publication date: 2015-11-11T12:59Z
CVSS 2.0 score: 4.3
CVSS 2.0 severity: MEDIUM
NVD Vector - V2.0: AV:N/AC:M/Au:N/C:P/I:N/A:N
NVD Vector - V3.1: None
""",
"creationdate": "2020-08-25T17:16:52.536Z",
"modified": "2020-08-25T17:16:52.536Z",
"externalid": "CVE-2015-6086",
"sixgilldvescorecurrent": None,
"sixgilldvescorehighesteverdate": "2016-04-14T00:00Z",
"sixgilldvescorehighestever": 7.02,
"sixgillpreviouslyexploitedprobability": 1.51,
"eventname": "Sixgill_score_level_change",
"eventtype": "score_level",
"eventaction": "modified",
"previouslevel": "prev_level",
"eventdescription": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": None,
"cvss31severity": None,
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"nvdlastmodifieddate": "2018-10-12T22:10Z",
"nvdpublicationdate": "2015-11-11T12:59Z",
"cvss20score": 4.3,
"cvss20severity": "MEDIUM",
"nvdvectorv20": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"nvdvectorv31": "None",
},
},
{
"value": "CVE-2015-6086",
"type": "CVE",
"rawJSON": {
"value": "CVE-2015-6086",
"type": "x-cybersixgill-com-cve-event",
"created": "2020-08-25T17:16:52.536Z",
"external_references": [{"external_id": "CVE-2015-6086", "source_name": "cve"}],
"id": "cveevent--1d6320f1-8b22-48e2-876d-5e31b9d36288",
"modified": "2020-08-25T17:16:52.536Z",
"object_marking_refs": [
"marking-definition--41eaaf7c-0bc0-4c56-abdf-d89a7f096ac4",
"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82",
],
"spec_version": "2.0",
"x_sixgill_info": {
"event": {
"_id": "5f454784ffebcfa91197c9d0",
"action": "modified",
"description": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"event_datetime": "2020-06-30T00:00Z",
"level": "None",
"name": "Sixgill_score_level_change",
"prev_level": "prev_level",
"type": "score_level",
},
"nvd": {
"base_score_v3": None,
"base_severity_v3": None,
"link": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"modified": "2018-10-12T22:10Z",
"published": "2015-11-11T12:59Z",
"score_2_0": 4.3,
"severity_2_0": "MEDIUM",
"vector_v2": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"vector_v3": "None",
},
"score": {
"current": None,
"highest": {"date": "2016-04-14T00:00Z", "value": 7.02},
"previouslyExploited": 1.51,
},
},
},
"score": "3",
"fields": {
"description": """Description: Sixgill Current score of CVE-2015-6086 changed from Low to None.
Created: 2020-08-25T17:16:52.536Z
Modified: 2020-08-25T17:16:52.536Z
External id: CVE-2015-6086
Sixgill DVE score - current: None
Sixgill DVE score - highest ever date: 2016-04-14T00:00Z
Sixgill DVE score - highest ever: 7.02
Sixgill - Previously exploited probability: 1.51
Event Name: Sixgill_score_level_change
Event Type: score_level
Event Action: modified
Previous level: prev_level
Event Description: Sixgill Current score of CVE-2015-6086 changed from Low to None.
Event Datetime: 2020-06-30T00:00Z
CVSS 3.1 score: None
CVSS 3.1 severity: None
NVD Link: https://nvd.nist.gov/vuln/detail/CVE-2015-6086
NVD - last modified date: 2018-10-12T22:10Z
NVD - publication date: 2015-11-11T12:59Z
CVSS 2.0 score: 4.3
CVSS 2.0 severity: MEDIUM
NVD Vector - V2.0: AV:N/AC:M/Au:N/C:P/I:N/A:N
NVD Vector - V3.1: None
""",
"creationdate": "2020-08-25T17:16:52.536Z",
"modified": "2020-08-25T17:16:52.536Z",
"externalid": "CVE-2015-6086",
"sixgilldvescorecurrent": None,
"sixgilldvescorehighesteverdate": "2016-04-14T00:00Z",
"sixgilldvescorehighestever": 7.02,
"sixgillpreviouslyexploitedprobability": 1.51,
"eventname": "Sixgill_score_level_change",
"eventtype": "score_level",
"eventaction": "modified",
"previouslevel": "prev_level",
"eventdescription": "Sixgill Current score of CVE-2015-6086 changed from Low to None.",
"eventdatetime": "2020-06-30T00:00Z",
"cvss31score": None,
"cvss31severity": None,
"nvdlink": "https://nvd.nist.gov/vuln/detail/CVE-2015-6086",
"nvdlastmodifieddate": "2018-10-12T22:10Z",
"nvdpublicationdate": "2015-11-11T12:59Z",
"cvss20score": 4.3,
"cvss20severity": "MEDIUM",
"nvdvectorv20": "AV:N/AC:M/Au:N/C:P/I:N/A:N",
"nvdvectorv31": "None",
},
},
]
class MockedResponse(object):
    def __init__(
        self,
        status_code,
        text,
        reason=None,
        url=None,
        method=None,
    ):
        self.status_code = status_code
        self.text = text
        self.reason = reason
        self.url = url
        self.request = requests.Request("GET")
        self.ok = self.status_code == 200

    def json(self):
        return json.loads(self.text)
def init_params():
    return {"client_id": "WRONG_CLIENT_ID_TEST", "client_secret": "CLIENT_SECRET_TEST"}


def mocked_request(*args, **kwargs):
    global bundle_index
    global submitted_indicators
    request = kwargs.get("request", {})
    end_point = request.path_url
    method = request.method
    response_dict = {
        "POST": {
            "/auth/token": MockedResponse(200, mocked_get_token_response),
            "/dvefeed/ioc/ack": MockedResponse(200, str(submitted_indicators)),
        },
        "GET": {"/dvefeed/ioc?limit=1000": MockedResponse(200, json.dumps(iocs_bundle[bundle_index]))},
    }
    response_dict = response_dict.get(method)
    response = response_dict.get(end_point)
    if method == "GET" and end_point == "/dvefeed/ioc?limit=1000":
        submitted_indicators = len(iocs_bundle[bundle_index].get("objects")) - 2
        bundle_index += 1
    return response
def test_test_module_command_raise_exception(mocker):
    mocker.patch.object(demisto, "params", return_value=init_params())
    mocker.patch("requests.sessions.Session.send", return_value=MockedResponse(400, "error"))
    from Sixgill_DVE_Feed import module_command_test

    with pytest.raises(Exception):
        module_command_test()


def test_test_module_command(mocker):
    mocker.patch.object(demisto, "params", return_value=init_params())
    mocker.patch("requests.sessions.Session.send", return_value=MockedResponse(200, "ok"))
    from Sixgill_DVE_Feed import module_command_test

    module_command_test()
def test_fetch_indicators_command(mocker):
    global bundle_index
    global submitted_indicators
    mocker.patch.object(demisto, "params", return_value=init_params())
    mocker.patch("requests.sessions.Session.send", new=mocked_request)
    from Sixgill_DVE_Feed import fetch_indicators_command
    from sixgill.sixgill_feed_client import SixgillFeedClient
    from sixgill.sixgill_constants import FeedStream

    client = SixgillFeedClient(
        "client_id",
        "client_secret",
        "some_channel",
        FeedStream.DVEFEED,
        demisto,
        1000,
    )
    output = fetch_indicators_command(client)
    bundle_index = 0
    submitted_indicators = 0

    assert output == expected_ioc_output
def test_get_indicators_command(mocker):
    global bundle_index
    global submitted_indicators
    mocker.patch.object(demisto, "params", return_value=init_params())
    mocker.patch("requests.sessions.Session.send", new=mocked_request)
    from Sixgill_DVE_Feed import get_indicators_command
    from sixgill.sixgill_feed_client import SixgillFeedClient
    from sixgill.sixgill_constants import FeedStream

    client = SixgillFeedClient(
        "client_id",
        "client_secret",
        "some_channel",
        FeedStream.DVEFEED,
        demisto,
        1000,
    )
    output = get_indicators_command(client, {"limit": 10})
    bundle_index = 0
    submitted_indicators = 0

    assert output[2] == expected_ioc_output
@pytest.mark.parametrize("tlp_color", ["", None, "AMBER"])
def test_feed_tags_and_tlp_color(mocker, tlp_color):
    """
    Given:
    - feedTags parameter
    When:
    - Executing fetch command on feed
    Then:
    - Validate the tags supplied are added to the tags list in addition to the tags that were there before
    """
    global bundle_index
    global submitted_indicators
    mocker.patch.object(demisto, "params", return_value=init_params())
    mocker.patch("requests.sessions.Session.send", new=mocked_request)
    from Sixgill_DVE_Feed import fetch_indicators_command
    from sixgill.sixgill_feed_client import SixgillFeedClient
    from sixgill.sixgill_constants import FeedStream

    client = SixgillFeedClient(
        "client_id",
        "client_secret",
        "some_channel",
        FeedStream.DVEFEED,
        demisto,
        1000,
    )
    output = fetch_indicators_command(client, tags=["tag1", "tag2"], tlp_color=tlp_color)

    assert all(item in output[0]["fields"]["tags"] for item in ["tag1", "tag2"])
    if tlp_color:
        assert output[0]["fields"]["trafficlightprotocol"] == tlp_color
    else:
        assert not output[0]["fields"].get("trafficlightprotocol")
    bundle_index -= 1
| 41.639442 | 118 | 0.513491 | 4,466 | 41,806 | 4.695029 | 0.077698 | 0.020031 | 0.007631 | 0.016024 | 0.863172 | 0.845574 | 0.820441 | 0.814193 | 0.793924 | 0.785864 | 0 | 0.151543 | 0.340214 | 41,806 | 1,003 | 119 | 41.680957 | 0.608636 | 0.00421 | 0 | 0.708595 | 0 | 0.033543 | 0.504304 | 0.148435 | 0 | 0 | 0 | 0 | 0.005241 | 1 | 0.009434 | false | 0 | 0.015723 | 0.002096 | 0.02935 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7e22d7e45a980ab9f5af611262bd32ebbe306e87 | 25,014 | py | Python | Program.py | FlamesLLC/Lucid64 | dbd24a1ea951b9a5ce4774afd85ca80be3742b2e | [
"MIT"
] | null | null | null | Program.py | FlamesLLC/Lucid64 | dbd24a1ea951b9a5ce4774afd85ca80be3742b2e | [
"MIT"
] | null | null | null | Program.py | FlamesLLC/Lucid64 | dbd24a1ea951b9a5ce4774afd85ca80be3742b2e | [
"MIT"
] | null | null | null | ### EMUAI (c) N64 LAB TEAM
from PyQt5.QtWidgets import QApplication, QWidget, QPushButton, QLabel, QLineEdit, QMessageBox
from PyQt5.QtGui import QPixmap, QIcon
from PyQt5.QtCore import QCoreApplication
from PyQt5.QtCore import Qt
import sys
import os
import re
import time
import random
import subprocess
import threading
import json
from datetime import datetime
from pynput.keyboard import Key, Listener
import platform
import socket
import base64
import hashlib
import requests
from PyQt5.QtCore import QThread, pyqtSignal, QObject, QEvent, QTimer
from PyQt5.QtGui import QPainter, QPen, QBrush, QColor, QFont
from PyQt5.QtWidgets import QMainWindow, QFileDialog
import math
import numpy as np
import numpy.random as npr
import scipy.stats as sps
import scipy.misc as spm
import matplotlib.pyplot as plt
import matplotlib.animation as animation
import matplotlib.patches as mpatches
import matplotlib.cm as cm
import matplotlib.colors as colors
import matplotlib.colorbar as cbar
from matplotlib import rc, ticker
from matplotlib.ticker import (
    MaxNLocator, FormatStrFormatter, LinearLocator, ScalarFormatter,
)
from matplotlib.colors import (
    LogNorm, Normalize, PowerNorm, SymLogNorm, BoundaryNorm,
    ListedColormap, LinearSegmentedColormap,
)
from matplotlib.patches import (
    Rectangle, Circle, Ellipse, FancyArrowPatch, Patch,
    RegularPolygon, Polygon, Wedge, Arc,
)
from matplotlib.collections import (
    PatchCollection, LineCollection, PolyCollection, PathCollection,
)
from matplotlib.lines import Line2D
from matplotlib.markers import MarkerStyle
## make a project 64 like gui
import tkinter
from tkinter import *
from tkinter import ttk
from tkinter import filedialog
from tkinter import messagebox
print("Welcome to EMUN64 Please insert rom")
## write a mips 64 like rom loader
rom = filedialog.askopenfilename(
    initialdir="/",
    title="Select file",
    filetypes=(("n64 rom files", "*.z64"), ("all files", "*.*")),
)
os.system("mips64 -n -m -f " + rom)
## write the os that reads the data from n64
os.system("python3.6 -m emuai.emuai")
## write the os that writes the data to n64
os.system("python3.6 -m emuai.emuai -w")
## write a gui in tkinter for showing the game
from matplotlib.backends.backend_tkagg import FigureCanvasTkAgg
from matplotlib.figure import Figure
import tkinter as tk
from tkinter import ttk, filedialog, messagebox, simpledialog, colorchooser, scrolledtext
import tkinter.font as tkfont
## write the mips cpu data loader
| 30.467722 | 133 | 0.82094 | 3,305 | 25,014 | 6.213011 | 0.060212 | 0.160222 | 0.071589 | 0.092042 | 0.958362 | 0.956852 | 0.952615 | 0.940635 | 0.940635 | 0.936203 | 0 | 0.006334 | 0.147957 | 25,014 | 820 | 134 | 30.504878 | 0.957115 | 0.034421 | 0 | 0.96438 | 0 | 0 | 0.020104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.963061 | null | null | 0.001319 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 9 |
7e2779883eee0c850658dc660bbc3a2d864e2c1e | 414 | py | Python | example/lower_read/app.py | jhesketh/dynaconf | a8038b87763ae8e790ff7e745b9335f997d5bd16 | [
"MIT"
] | 1 | 2021-07-21T17:06:16.000Z | 2021-07-21T17:06:16.000Z | example/lower_read/app.py | jhesketh/dynaconf | a8038b87763ae8e790ff7e745b9335f997d5bd16 | [
"MIT"
] | null | null | null | example/lower_read/app.py | jhesketh/dynaconf | a8038b87763ae8e790ff7e745b9335f997d5bd16 | [
"MIT"
] | null | null | null | from dynaconf import LazySettings
settings = LazySettings(envless_mode=True, lowercase_read=True)
assert settings.server == "foo.com"
assert settings.SERVER == "foo.com"
assert settings["SERVER"] == "foo.com"
assert settings["server"] == "foo.com"
assert settings("SERVER") == "foo.com"
assert settings("server") == "foo.com"
assert settings.get("SERVER") == "foo.com"
assert settings.get("server") == "foo.com"
| 31.846154 | 63 | 0.719807 | 54 | 414 | 5.481481 | 0.277778 | 0.378378 | 0.324324 | 0.425676 | 0.722973 | 0.722973 | 0.722973 | 0.722973 | 0.722973 | 0.574324 | 0 | 0 | 0.101449 | 414 | 12 | 64 | 34.5 | 0.795699 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0.8 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7e3a63d4944eb1c5097fe9f0ad03b7d0f220afdf | 3,461 | py | Python | tests/test_day_24.py | maddenvvs/advent-of-code-2021 | 00cd36ef07987dd004312e020940f0bd14b0dc85 | [
"MIT"
] | null | null | null | tests/test_day_24.py | maddenvvs/advent-of-code-2021 | 00cd36ef07987dd004312e020940f0bd14b0dc85 | [
"MIT"
] | null | null | null | tests/test_day_24.py | maddenvvs/advent-of-code-2021 | 00cd36ef07987dd004312e020940f0bd14b0dc85 | [
"MIT"
] | null | null | null | from aoc2021.day_24 import ALU, parse_program, find_numbers_pair
def test_monad_first_example() -> None:
    program = parse_program(
        """inp x
mul x -1"""
    )
    monad = ALU(program)
    monad.run("2")

    assert monad.vars["x"] == -2


def test_monad_second_example() -> None:
    program = parse_program(
        """inp z
inp x
mul z 3
eql z x"""
    )
    monad = ALU(program)
    monad.run("23")

    assert monad.vars["z"] == 0


def test_monad_third_example() -> None:
    program = parse_program(
        """inp z
inp x
mul z 3
eql z x"""
    )
    monad = ALU(program)
    monad.run("26")

    assert monad.vars["z"] == 1


def test_monad_fourth_example() -> None:
    program = parse_program(
        """inp w
add z w
mod z 2
div w 2
add y w
mod y 2
div w 2
add x w
mod x 2
div w 2
mod w 2
"""
    )
    monad = ALU(program)
    monad.run("7")

    assert monad.vars == {"w": 0, "x": 1, "y": 1, "z": 1}
TEST_PROGRAM = """inp w
mul x 0
add x z
mod x 26
div z 1
add x 12
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 14
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 12
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 11
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -9
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 12
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -7
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 11
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 2
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -1
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 11
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -16
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 11
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 10
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -15
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 2
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 10
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 0
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 1
add x 12
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 0
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x -4
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y
inp w
mul x 0
add x z
mod x 26
div z 26
add x 0
eql x w
eql x 0
mul y 0
add y 25
mul y x
add y 1
mul z y
mul y 0
add y w
add y 15
mul y x
add z y"""
def test_found_solution_part_one() -> None:
program = parse_program(TEST_PROGRAM)
first_number, _ = find_numbers_pair(program)
monad = ALU(program)
monad.run(str(first_number))
assert monad.vars["z"] == 0
def test_found_solution_part_two() -> None:
program = parse_program(TEST_PROGRAM)
_, second_number = find_numbers_pair(program)
monad = ALU(program)
monad.run(str(second_number))
assert monad.vars["z"] == 0
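The tests above assume an `ALU`/`parse_program` API imported from `aoc2021.day_24`. A minimal self-contained sketch of what such an interpreter might look like (a hypothetical reimplementation, not the repository's code):

```python
from typing import Dict, List, Tuple

Instruction = Tuple[str, ...]


def parse_program(text: str) -> List[Instruction]:
    """Split each non-empty line of ALU assembly into opcode + operands."""
    return [tuple(line.split()) for line in text.strip().splitlines() if line.strip()]


class ALU:
    def __init__(self, program: List[Instruction]) -> None:
        self.program = program
        self.vars: Dict[str, int] = {"w": 0, "x": 0, "y": 0, "z": 0}

    def _value(self, operand: str) -> int:
        # Operands are either a register name or an integer literal.
        return self.vars[operand] if operand in self.vars else int(operand)

    def run(self, inputs: str) -> None:
        digits = iter(inputs)
        for op, *args in self.program:
            if op == "inp":
                self.vars[args[0]] = int(next(digits))
                continue
            a, b = args[0], self._value(args[1])
            if op == "add":
                self.vars[a] += b
            elif op == "mul":
                self.vars[a] *= b
            elif op == "div":
                self.vars[a] = int(self.vars[a] / b)  # truncate toward zero
            elif op == "mod":
                self.vars[a] %= b
            elif op == "eql":
                self.vars[a] = int(self.vars[a] == b)
```

With this sketch, the first example behaves as the test expects: running `inp x / mul x -1` on input `"2"` leaves `x == -2`.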
| 10.239645 | 64 | 0.625542 | 960 | 3,461 | 2.210417 | 0.058333 | 0.107446 | 0.065975 | 0.105561 | 0.871348 | 0.821866 | 0.745052 | 0.722432 | 0.722432 | 0.722432 | 0 | 0.095097 | 0.316383 | 3,461 | 337 | 65 | 10.27003 | 0.801775 | 0 | 0 | 0.878893 | 0 | 0 | 0.628084 | 0 | 0 | 0 | 0 | 0 | 0.020761 | 1 | 0.020761 | false | 0 | 0.00346 | 0 | 0.024221 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
7e898a793347ec21a96a9d2e1394fc025e928a48 | 52 | py | Python | extra_tests/snippets/intro/3.1.1.6.py | mainsail-org/RustPython | 5d2d87c24f1ff7201fcc8d4fcffadb0ec12dc127 | [
"CC-BY-4.0",
"MIT"
] | 11,058 | 2018-05-29T07:40:06.000Z | 2022-03-31T11:38:42.000Z | extra_tests/snippets/intro/3.1.1.6.py | mainsail-org/RustPython | 5d2d87c24f1ff7201fcc8d4fcffadb0ec12dc127 | [
"CC-BY-4.0",
"MIT"
] | 2,105 | 2018-06-01T10:07:16.000Z | 2022-03-31T14:56:42.000Z | extra_tests/snippets/intro/3.1.1.6.py | mainsail-org/RustPython | 5d2d87c24f1ff7201fcc8d4fcffadb0ec12dc127 | [
"CC-BY-4.0",
"MIT"
] | 914 | 2018-07-27T09:36:14.000Z | 2022-03-31T19:56:34.000Z | assert 7.5 == 3 * 3.75 / 1.5
assert 3.5 == 7.0 / 2
| 13 | 28 | 0.480769 | 14 | 52 | 1.785714 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.351351 | 0.288462 | 52 | 3 | 29 | 17.333333 | 0.324324 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0e69c2e7835139638f8172ec0a6837a87605b6e5 | 39,976 | py | Python | sdk/python/pulumi_alicloud/alb/load_balancer.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 42 | 2019-03-18T06:34:37.000Z | 2022-03-24T07:08:57.000Z | sdk/python/pulumi_alicloud/alb/load_balancer.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 152 | 2019-04-15T21:03:44.000Z | 2022-03-29T18:00:57.000Z | sdk/python/pulumi_alicloud/alb/load_balancer.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2020-08-26T17:30:07.000Z | 2021-07-05T01:37:45.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['LoadBalancerArgs', 'LoadBalancer']
@pulumi.input_type
class LoadBalancerArgs:
def __init__(__self__, *,
address_type: pulumi.Input[str],
load_balancer_billing_config: pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs'],
load_balancer_edition: pulumi.Input[str],
load_balancer_name: pulumi.Input[str],
vpc_id: pulumi.Input[str],
zone_mappings: pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]],
access_log_config: Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']] = None,
address_allocated_mode: Optional[pulumi.Input[str]] = None,
deletion_protection_enabled: Optional[pulumi.Input[bool]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
modification_protection_config: Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None):
"""
The set of arguments for constructing a LoadBalancer resource.
:param pulumi.Input[str] address_type: The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
:param pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs'] load_balancer_billing_config: The configuration of the billing method.
:param pulumi.Input[str] load_balancer_edition: The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
:param pulumi.Input[str] load_balancer_name: The name of the resource.
:param pulumi.Input[str] vpc_id: The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
:param pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]] zone_mappings: The zones and vSwitches. You must specify at least two zones.
:param pulumi.Input['LoadBalancerAccessLogConfigArgs'] access_log_config: The Access Logging Configuration Structure.
:param pulumi.Input[str] address_allocated_mode: The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
               * `Fixed`: The ALB instance uses a fixed IP address.
               * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
        :param pulumi.Input[bool] deletion_protection_enabled: Whether deletion protection is enabled. Valid values: `true` and `false`. Default value: `false`.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the API request. Valid values: `true` and `false`.
        :param pulumi.Input['LoadBalancerModificationProtectionConfigArgs'] modification_protection_config: The modification protection configuration.
:param pulumi.Input[str] resource_group_id: The ID of the resource group.
"""
pulumi.set(__self__, "address_type", address_type)
pulumi.set(__self__, "load_balancer_billing_config", load_balancer_billing_config)
pulumi.set(__self__, "load_balancer_edition", load_balancer_edition)
pulumi.set(__self__, "load_balancer_name", load_balancer_name)
pulumi.set(__self__, "vpc_id", vpc_id)
pulumi.set(__self__, "zone_mappings", zone_mappings)
if access_log_config is not None:
pulumi.set(__self__, "access_log_config", access_log_config)
if address_allocated_mode is not None:
pulumi.set(__self__, "address_allocated_mode", address_allocated_mode)
if deletion_protection_enabled is not None:
pulumi.set(__self__, "deletion_protection_enabled", deletion_protection_enabled)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if modification_protection_config is not None:
pulumi.set(__self__, "modification_protection_config", modification_protection_config)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if tags is not None:
pulumi.set(__self__, "tags", tags)
@property
@pulumi.getter(name="addressType")
def address_type(self) -> pulumi.Input[str]:
"""
The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
"""
return pulumi.get(self, "address_type")
@address_type.setter
def address_type(self, value: pulumi.Input[str]):
pulumi.set(self, "address_type", value)
@property
@pulumi.getter(name="loadBalancerBillingConfig")
def load_balancer_billing_config(self) -> pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs']:
"""
The configuration of the billing method.
"""
return pulumi.get(self, "load_balancer_billing_config")
@load_balancer_billing_config.setter
def load_balancer_billing_config(self, value: pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs']):
pulumi.set(self, "load_balancer_billing_config", value)
@property
@pulumi.getter(name="loadBalancerEdition")
def load_balancer_edition(self) -> pulumi.Input[str]:
"""
The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
"""
return pulumi.get(self, "load_balancer_edition")
@load_balancer_edition.setter
def load_balancer_edition(self, value: pulumi.Input[str]):
pulumi.set(self, "load_balancer_edition", value)
@property
@pulumi.getter(name="loadBalancerName")
def load_balancer_name(self) -> pulumi.Input[str]:
"""
The name of the resource.
"""
return pulumi.get(self, "load_balancer_name")
@load_balancer_name.setter
def load_balancer_name(self, value: pulumi.Input[str]):
pulumi.set(self, "load_balancer_name", value)
@property
@pulumi.getter(name="vpcId")
def vpc_id(self) -> pulumi.Input[str]:
"""
The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
"""
return pulumi.get(self, "vpc_id")
@vpc_id.setter
def vpc_id(self, value: pulumi.Input[str]):
pulumi.set(self, "vpc_id", value)
@property
@pulumi.getter(name="zoneMappings")
def zone_mappings(self) -> pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]]:
"""
The zones and vSwitches. You must specify at least two zones.
"""
return pulumi.get(self, "zone_mappings")
@zone_mappings.setter
def zone_mappings(self, value: pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]]):
pulumi.set(self, "zone_mappings", value)
@property
@pulumi.getter(name="accessLogConfig")
def access_log_config(self) -> Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']]:
"""
The Access Logging Configuration Structure.
"""
return pulumi.get(self, "access_log_config")
@access_log_config.setter
def access_log_config(self, value: Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']]):
pulumi.set(self, "access_log_config", value)
@property
@pulumi.getter(name="addressAllocatedMode")
def address_allocated_mode(self) -> Optional[pulumi.Input[str]]:
"""
The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
        * `Fixed`: The ALB instance uses a fixed IP address.
        * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
"""
return pulumi.get(self, "address_allocated_mode")
@address_allocated_mode.setter
def address_allocated_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address_allocated_mode", value)
@property
@pulumi.getter(name="deletionProtectionEnabled")
def deletion_protection_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
        Whether deletion protection is enabled. Valid values: `true` and `false`. Default value: `false`.
"""
return pulumi.get(self, "deletion_protection_enabled")
@deletion_protection_enabled.setter
def deletion_protection_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection_enabled", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to precheck the API request. Valid values: `true` and `false`.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="modificationProtectionConfig")
def modification_protection_config(self) -> Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']]:
"""
        The modification protection configuration.
"""
return pulumi.get(self, "modification_protection_config")
@modification_protection_config.setter
def modification_protection_config(self, value: Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']]):
pulumi.set(self, "modification_protection_config", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "tags", value)
@pulumi.input_type
class _LoadBalancerState:
def __init__(__self__, *,
access_log_config: Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']] = None,
address_allocated_mode: Optional[pulumi.Input[str]] = None,
address_type: Optional[pulumi.Input[str]] = None,
deletion_protection_enabled: Optional[pulumi.Input[bool]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
load_balancer_billing_config: Optional[pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs']] = None,
load_balancer_edition: Optional[pulumi.Input[str]] = None,
load_balancer_name: Optional[pulumi.Input[str]] = None,
modification_protection_config: Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_id: Optional[pulumi.Input[str]] = None,
zone_mappings: Optional[pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]]] = None):
"""
Input properties used for looking up and filtering LoadBalancer resources.
:param pulumi.Input['LoadBalancerAccessLogConfigArgs'] access_log_config: The Access Logging Configuration Structure.
:param pulumi.Input[str] address_allocated_mode: The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
               * `Fixed`: The ALB instance uses a fixed IP address.
               * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
:param pulumi.Input[str] address_type: The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
        :param pulumi.Input[bool] deletion_protection_enabled: Whether deletion protection is enabled. Valid values: `true` and `false`. Default value: `false`.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the API request. Valid values: `true` and `false`.
:param pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs'] load_balancer_billing_config: The configuration of the billing method.
:param pulumi.Input[str] load_balancer_edition: The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
:param pulumi.Input[str] load_balancer_name: The name of the resource.
        :param pulumi.Input['LoadBalancerModificationProtectionConfigArgs'] modification_protection_config: The modification protection configuration.
:param pulumi.Input[str] resource_group_id: The ID of the resource group.
:param pulumi.Input[str] status: Specifies whether to enable the configuration read-only mode for the ALB instance. Valid values: `NonProtection` and `ConsoleProtection`.
:param pulumi.Input[str] vpc_id: The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
:param pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]] zone_mappings: The zones and vSwitches. You must specify at least two zones.
"""
if access_log_config is not None:
pulumi.set(__self__, "access_log_config", access_log_config)
if address_allocated_mode is not None:
pulumi.set(__self__, "address_allocated_mode", address_allocated_mode)
if address_type is not None:
pulumi.set(__self__, "address_type", address_type)
if deletion_protection_enabled is not None:
pulumi.set(__self__, "deletion_protection_enabled", deletion_protection_enabled)
if dry_run is not None:
pulumi.set(__self__, "dry_run", dry_run)
if load_balancer_billing_config is not None:
pulumi.set(__self__, "load_balancer_billing_config", load_balancer_billing_config)
if load_balancer_edition is not None:
pulumi.set(__self__, "load_balancer_edition", load_balancer_edition)
if load_balancer_name is not None:
pulumi.set(__self__, "load_balancer_name", load_balancer_name)
if modification_protection_config is not None:
pulumi.set(__self__, "modification_protection_config", modification_protection_config)
if resource_group_id is not None:
pulumi.set(__self__, "resource_group_id", resource_group_id)
if status is not None:
pulumi.set(__self__, "status", status)
if tags is not None:
pulumi.set(__self__, "tags", tags)
if vpc_id is not None:
pulumi.set(__self__, "vpc_id", vpc_id)
if zone_mappings is not None:
pulumi.set(__self__, "zone_mappings", zone_mappings)
@property
@pulumi.getter(name="accessLogConfig")
def access_log_config(self) -> Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']]:
"""
The Access Logging Configuration Structure.
"""
return pulumi.get(self, "access_log_config")
@access_log_config.setter
def access_log_config(self, value: Optional[pulumi.Input['LoadBalancerAccessLogConfigArgs']]):
pulumi.set(self, "access_log_config", value)
@property
@pulumi.getter(name="addressAllocatedMode")
def address_allocated_mode(self) -> Optional[pulumi.Input[str]]:
"""
The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
        * `Fixed`: The ALB instance uses a fixed IP address.
        * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
"""
return pulumi.get(self, "address_allocated_mode")
@address_allocated_mode.setter
def address_allocated_mode(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address_allocated_mode", value)
@property
@pulumi.getter(name="addressType")
def address_type(self) -> Optional[pulumi.Input[str]]:
"""
The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
"""
return pulumi.get(self, "address_type")
@address_type.setter
def address_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "address_type", value)
@property
@pulumi.getter(name="deletionProtectionEnabled")
def deletion_protection_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
        Whether deletion protection is enabled. Valid values: `true` and `false`. Default value: `false`.
"""
return pulumi.get(self, "deletion_protection_enabled")
@deletion_protection_enabled.setter
def deletion_protection_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "deletion_protection_enabled", value)
@property
@pulumi.getter(name="dryRun")
def dry_run(self) -> Optional[pulumi.Input[bool]]:
"""
Specifies whether to precheck the API request. Valid values: `true` and `false`.
"""
return pulumi.get(self, "dry_run")
@dry_run.setter
def dry_run(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "dry_run", value)
@property
@pulumi.getter(name="loadBalancerBillingConfig")
def load_balancer_billing_config(self) -> Optional[pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs']]:
"""
The configuration of the billing method.
"""
return pulumi.get(self, "load_balancer_billing_config")
@load_balancer_billing_config.setter
def load_balancer_billing_config(self, value: Optional[pulumi.Input['LoadBalancerLoadBalancerBillingConfigArgs']]):
pulumi.set(self, "load_balancer_billing_config", value)
@property
@pulumi.getter(name="loadBalancerEdition")
def load_balancer_edition(self) -> Optional[pulumi.Input[str]]:
"""
The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
"""
return pulumi.get(self, "load_balancer_edition")
@load_balancer_edition.setter
def load_balancer_edition(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "load_balancer_edition", value)
@property
@pulumi.getter(name="loadBalancerName")
def load_balancer_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the resource.
"""
return pulumi.get(self, "load_balancer_name")
@load_balancer_name.setter
def load_balancer_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "load_balancer_name", value)
@property
@pulumi.getter(name="modificationProtectionConfig")
def modification_protection_config(self) -> Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']]:
"""
        The modification protection configuration.
"""
return pulumi.get(self, "modification_protection_config")
@modification_protection_config.setter
def modification_protection_config(self, value: Optional[pulumi.Input['LoadBalancerModificationProtectionConfigArgs']]):
pulumi.set(self, "modification_protection_config", value)
@property
@pulumi.getter(name="resourceGroupId")
def resource_group_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the resource group.
"""
return pulumi.get(self, "resource_group_id")
@resource_group_id.setter
def resource_group_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource_group_id", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to enable the configuration read-only mode for the ALB instance. Valid values: `NonProtection` and `ConsoleProtection`.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@property
@pulumi.getter
def tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
return pulumi.get(self, "tags")
@tags.setter
def tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "tags", value)
@property
@pulumi.getter(name="vpcId")
def vpc_id(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
"""
return pulumi.get(self, "vpc_id")
@vpc_id.setter
def vpc_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "vpc_id", value)
@property
@pulumi.getter(name="zoneMappings")
def zone_mappings(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]]]:
"""
The zones and vSwitches. You must specify at least two zones.
"""
return pulumi.get(self, "zone_mappings")
@zone_mappings.setter
def zone_mappings(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['LoadBalancerZoneMappingArgs']]]]):
pulumi.set(self, "zone_mappings", value)
class LoadBalancer(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
access_log_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerAccessLogConfigArgs']]] = None,
address_allocated_mode: Optional[pulumi.Input[str]] = None,
address_type: Optional[pulumi.Input[str]] = None,
deletion_protection_enabled: Optional[pulumi.Input[bool]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
load_balancer_billing_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerLoadBalancerBillingConfigArgs']]] = None,
load_balancer_edition: Optional[pulumi.Input[str]] = None,
load_balancer_name: Optional[pulumi.Input[str]] = None,
modification_protection_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerModificationProtectionConfigArgs']]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_id: Optional[pulumi.Input[str]] = None,
zone_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LoadBalancerZoneMappingArgs']]]]] = None,
__props__=None):
"""
        Provides an ALB Load Balancer resource.
For information about ALB Load Balancer and how to use it, see [What is Load Balancer](https://www.alibabacloud.com/help/doc-detail/197341.htm).
> **NOTE:** Available in v1.132.0+.
## Import
ALB Load Balancer can be imported using the id, e.g.
```sh
$ pulumi import alicloud:alb/loadBalancer:LoadBalancer example <id>
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['LoadBalancerAccessLogConfigArgs']] access_log_config: The Access Logging Configuration Structure.
:param pulumi.Input[str] address_allocated_mode: The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
               * `Fixed`: The ALB instance uses a fixed IP address.
               * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
:param pulumi.Input[str] address_type: The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
        :param pulumi.Input[bool] deletion_protection_enabled: Whether deletion protection is enabled. Valid values: `true` and `false`. Default value: `false`.
:param pulumi.Input[bool] dry_run: Specifies whether to precheck the API request. Valid values: `true` and `false`.
:param pulumi.Input[pulumi.InputType['LoadBalancerLoadBalancerBillingConfigArgs']] load_balancer_billing_config: The configuration of the billing method.
:param pulumi.Input[str] load_balancer_edition: The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
:param pulumi.Input[str] load_balancer_name: The name of the resource.
        :param pulumi.Input[pulumi.InputType['LoadBalancerModificationProtectionConfigArgs']] modification_protection_config: The modification protection configuration.
:param pulumi.Input[str] resource_group_id: The ID of the resource group.
:param pulumi.Input[str] vpc_id: The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LoadBalancerZoneMappingArgs']]]] zone_mappings: The zones and vSwitches. You must specify at least two zones.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: LoadBalancerArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
        Provides an ALB Load Balancer resource.
For information about ALB Load Balancer and how to use it, see [What is Load Balancer](https://www.alibabacloud.com/help/doc-detail/197341.htm).
> **NOTE:** Available in v1.132.0+.
## Import
ALB Load Balancer can be imported using the id, e.g.
```sh
$ pulumi import alicloud:alb/loadBalancer:LoadBalancer example <id>
```
:param str resource_name: The name of the resource.
:param LoadBalancerArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(LoadBalancerArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
access_log_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerAccessLogConfigArgs']]] = None,
address_allocated_mode: Optional[pulumi.Input[str]] = None,
address_type: Optional[pulumi.Input[str]] = None,
deletion_protection_enabled: Optional[pulumi.Input[bool]] = None,
dry_run: Optional[pulumi.Input[bool]] = None,
load_balancer_billing_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerLoadBalancerBillingConfigArgs']]] = None,
load_balancer_edition: Optional[pulumi.Input[str]] = None,
load_balancer_name: Optional[pulumi.Input[str]] = None,
modification_protection_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerModificationProtectionConfigArgs']]] = None,
resource_group_id: Optional[pulumi.Input[str]] = None,
tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
vpc_id: Optional[pulumi.Input[str]] = None,
zone_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LoadBalancerZoneMappingArgs']]]]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = LoadBalancerArgs.__new__(LoadBalancerArgs)
__props__.__dict__["access_log_config"] = access_log_config
__props__.__dict__["address_allocated_mode"] = address_allocated_mode
if address_type is None and not opts.urn:
raise TypeError("Missing required property 'address_type'")
__props__.__dict__["address_type"] = address_type
__props__.__dict__["deletion_protection_enabled"] = deletion_protection_enabled
__props__.__dict__["dry_run"] = dry_run
if load_balancer_billing_config is None and not opts.urn:
raise TypeError("Missing required property 'load_balancer_billing_config'")
__props__.__dict__["load_balancer_billing_config"] = load_balancer_billing_config
if load_balancer_edition is None and not opts.urn:
raise TypeError("Missing required property 'load_balancer_edition'")
__props__.__dict__["load_balancer_edition"] = load_balancer_edition
if load_balancer_name is None and not opts.urn:
raise TypeError("Missing required property 'load_balancer_name'")
__props__.__dict__["load_balancer_name"] = load_balancer_name
__props__.__dict__["modification_protection_config"] = modification_protection_config
__props__.__dict__["resource_group_id"] = resource_group_id
__props__.__dict__["tags"] = tags
if vpc_id is None and not opts.urn:
raise TypeError("Missing required property 'vpc_id'")
__props__.__dict__["vpc_id"] = vpc_id
if zone_mappings is None and not opts.urn:
raise TypeError("Missing required property 'zone_mappings'")
__props__.__dict__["zone_mappings"] = zone_mappings
__props__.__dict__["status"] = None
super(LoadBalancer, __self__).__init__(
'alicloud:alb/loadBalancer:LoadBalancer',
resource_name,
__props__,
opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            access_log_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerAccessLogConfigArgs']]] = None,
            address_allocated_mode: Optional[pulumi.Input[str]] = None,
            address_type: Optional[pulumi.Input[str]] = None,
            deletion_protection_enabled: Optional[pulumi.Input[bool]] = None,
            dry_run: Optional[pulumi.Input[bool]] = None,
            load_balancer_billing_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerLoadBalancerBillingConfigArgs']]] = None,
            load_balancer_edition: Optional[pulumi.Input[str]] = None,
            load_balancer_name: Optional[pulumi.Input[str]] = None,
            modification_protection_config: Optional[pulumi.Input[pulumi.InputType['LoadBalancerModificationProtectionConfigArgs']]] = None,
            resource_group_id: Optional[pulumi.Input[str]] = None,
            status: Optional[pulumi.Input[str]] = None,
            tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
            vpc_id: Optional[pulumi.Input[str]] = None,
            zone_mappings: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LoadBalancerZoneMappingArgs']]]]] = None) -> 'LoadBalancer':
        """
        Get an existing LoadBalancer resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[pulumi.InputType['LoadBalancerAccessLogConfigArgs']] access_log_config: The Access Logging Configuration Structure.
        :param pulumi.Input[str] address_allocated_mode: The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
               * `Fixed`: The ALB instance uses a fixed IP address.
               * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
        :param pulumi.Input[str] address_type: The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
        :param pulumi.Input[bool] deletion_protection_enabled: The deletion protection enabled. Valid values: `true` and `false`. Default value: `false`.
        :param pulumi.Input[bool] dry_run: Specifies whether to precheck the API request. Valid values: `true` and `false`.
        :param pulumi.Input[pulumi.InputType['LoadBalancerLoadBalancerBillingConfigArgs']] load_balancer_billing_config: The configuration of the billing method.
        :param pulumi.Input[str] load_balancer_edition: The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
        :param pulumi.Input[str] load_balancer_name: The name of the resource.
        :param pulumi.Input[pulumi.InputType['LoadBalancerModificationProtectionConfigArgs']] modification_protection_config: Modify the Protection Configuration.
        :param pulumi.Input[str] resource_group_id: The ID of the resource group.
        :param pulumi.Input[str] status: Specifies whether to enable the configuration read-only mode for the ALB instance. Valid values: `NonProtection` and `ConsoleProtection`.
        :param pulumi.Input[str] vpc_id: The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['LoadBalancerZoneMappingArgs']]]] zone_mappings: The zones and vSwitches. You must specify at least two zones.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _LoadBalancerState.__new__(_LoadBalancerState)

        __props__.__dict__["access_log_config"] = access_log_config
        __props__.__dict__["address_allocated_mode"] = address_allocated_mode
        __props__.__dict__["address_type"] = address_type
        __props__.__dict__["deletion_protection_enabled"] = deletion_protection_enabled
        __props__.__dict__["dry_run"] = dry_run
        __props__.__dict__["load_balancer_billing_config"] = load_balancer_billing_config
        __props__.__dict__["load_balancer_edition"] = load_balancer_edition
        __props__.__dict__["load_balancer_name"] = load_balancer_name
        __props__.__dict__["modification_protection_config"] = modification_protection_config
        __props__.__dict__["resource_group_id"] = resource_group_id
        __props__.__dict__["status"] = status
        __props__.__dict__["tags"] = tags
        __props__.__dict__["vpc_id"] = vpc_id
        __props__.__dict__["zone_mappings"] = zone_mappings
        return LoadBalancer(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="accessLogConfig")
    def access_log_config(self) -> pulumi.Output[Optional['outputs.LoadBalancerAccessLogConfig']]:
        """
        The Access Logging Configuration Structure.
        """
        return pulumi.get(self, "access_log_config")

    @property
    @pulumi.getter(name="addressAllocatedMode")
    def address_allocated_mode(self) -> pulumi.Output[Optional[str]]:
        """
        The method in which IP addresses are assigned. Valid values: `Fixed` and `Dynamic`. Default value: `Dynamic`.
        * `Fixed`: The ALB instance uses a fixed IP address.
        * `Dynamic`: An IP address is dynamically assigned to each zone of the ALB instance.
        """
        return pulumi.get(self, "address_allocated_mode")

    @property
    @pulumi.getter(name="addressType")
    def address_type(self) -> pulumi.Output[str]:
        """
        The type of IP address that the ALB instance uses to provide services. Valid values: `Intranet`, `Internet`.
        """
        return pulumi.get(self, "address_type")

    @property
    @pulumi.getter(name="deletionProtectionEnabled")
    def deletion_protection_enabled(self) -> pulumi.Output[Optional[bool]]:
        """
        The deletion protection enabled. Valid values: `true` and `false`. Default value: `false`.
        """
        return pulumi.get(self, "deletion_protection_enabled")

    @property
    @pulumi.getter(name="dryRun")
    def dry_run(self) -> pulumi.Output[Optional[bool]]:
        """
        Specifies whether to precheck the API request. Valid values: `true` and `false`.
        """
        return pulumi.get(self, "dry_run")

    @property
    @pulumi.getter(name="loadBalancerBillingConfig")
    def load_balancer_billing_config(self) -> pulumi.Output['outputs.LoadBalancerLoadBalancerBillingConfig']:
        """
        The configuration of the billing method.
        """
        return pulumi.get(self, "load_balancer_billing_config")

    @property
    @pulumi.getter(name="loadBalancerEdition")
    def load_balancer_edition(self) -> pulumi.Output[str]:
        """
        The edition of the ALB instance. Different editions have different limits and billing methods. Valid values: `Basic` and `Standard`.
        """
        return pulumi.get(self, "load_balancer_edition")

    @property
    @pulumi.getter(name="loadBalancerName")
    def load_balancer_name(self) -> pulumi.Output[str]:
        """
        The name of the resource.
        """
        return pulumi.get(self, "load_balancer_name")

    @property
    @pulumi.getter(name="modificationProtectionConfig")
    def modification_protection_config(self) -> pulumi.Output['outputs.LoadBalancerModificationProtectionConfig']:
        """
        Modify the Protection Configuration.
        """
        return pulumi.get(self, "modification_protection_config")

    @property
    @pulumi.getter(name="resourceGroupId")
    def resource_group_id(self) -> pulumi.Output[str]:
        """
        The ID of the resource group.
        """
        return pulumi.get(self, "resource_group_id")

    @property
    @pulumi.getter
    def status(self) -> pulumi.Output[str]:
        """
        Specifies whether to enable the configuration read-only mode for the ALB instance. Valid values: `NonProtection` and `ConsoleProtection`.
        """
        return pulumi.get(self, "status")

    @property
    @pulumi.getter
    def tags(self) -> pulumi.Output[Optional[Mapping[str, Any]]]:
        return pulumi.get(self, "tags")

    @property
    @pulumi.getter(name="vpcId")
    def vpc_id(self) -> pulumi.Output[str]:
        """
        The ID of the virtual private cloud (VPC) where the ALB instance is deployed.
        """
        return pulumi.get(self, "vpc_id")

    @property
    @pulumi.getter(name="zoneMappings")
    def zone_mappings(self) -> pulumi.Output[Sequence['outputs.LoadBalancerZoneMapping']]:
        """
        The zones and vSwitches. You must specify at least two zones.
        """
        return pulumi.get(self, "zone_mappings")
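A detail worth noting in the generated code above: both `_internal_init` and `get` allocate the args/state object with `__new__` and fill `__dict__` directly, bypassing `__init__` entirely. Stripped of all Pulumi specifics, the pattern looks like this sketch (class and values here are illustrative, not part of the SDK):

```python
class LoadBalancerArgs:
    def __init__(self):
        # The generator never calls this path.
        raise RuntimeError("not used; instances are built via __new__")

# Allocate without running __init__, then populate attributes directly,
# mirroring `__props__ = LoadBalancerArgs.__new__(LoadBalancerArgs)` above.
props = LoadBalancerArgs.__new__(LoadBalancerArgs)
props.__dict__["address_type"] = "Internet"
props.__dict__["vpc_id"] = "vpc-123"  # hypothetical ID

print(props.address_type)
```

This is why the generated classes can carry required-but-unset fields during construction: validation happens in the surrounding code (the `opts.urn` checks), not in `__init__`.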
# Source: DuFerreira/projeto-she, applications/she/controllers/principal.py (BSD-3-Clause)
def login():
    return locals()
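In web2py, a controller action that returns `locals()` hands every local variable of the function to the view as a dict. In plain Python the mechanics are the same (the `message` variable below is an illustrative addition, not in the original controller):

```python
def login():
    message = "please sign in"
    # locals() captures every local name bound so far as a dict.
    return locals()

print(login())
```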
# Source: Jeremiah-England/Shapely, tests/test_buffer.py (BSD-3-Clause)
from . import unittest
from shapely import geometry


class BufferSingleSidedCase(unittest.TestCase):
    """ Test Buffer Point/Line/Polygon with and without single_sided params """

    def test_empty(self):
        g = geometry.Point(0, 0)
        h = g.buffer(0)
        assert h.is_empty

    def test_point(self):
        g = geometry.Point(0, 0)
        h = g.buffer(1, resolution=1)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(1.0, 0.0), (0, -1.0), (-1.0, 0), (0, 1.0), (1.0, 0.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_point_single_sided(self):
        g = geometry.Point(0, 0)
        h = g.buffer(1, resolution=1, single_sided=True)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(1.0, 0.0), (0, -1.0), (-1.0, 0), (0, 1.0), (1.0, 0.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_line(self):
        g = geometry.LineString([[0, 0], [0, 1]])
        h = g.buffer(1, resolution=1)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(-1.0, 1.0), (0, 2.0), (1.0, 1.0), (1.0, 0.0), (0, -1.0),
                          (-1.0, 0.0), (-1.0, 1.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_line_single_sided_left(self):
        g = geometry.LineString([[0, 0], [0, 1]])
        h = g.buffer(1, resolution=1, single_sided=True)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(0.0, 1.0), (0.0, 0.0), (-1.0, 0.0), (-1.0, 1.0), (0.0, 1.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_line_single_sided_right(self):
        g = geometry.LineString([[0, 0], [0, 1]])
        h = g.buffer(-1, resolution=1, single_sided=True)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0), (0.0, 0.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_polygon(self):
        g = geometry.Polygon([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]])
        h = g.buffer(1, resolution=1)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(-1.0, 0.0), (-1.0, 1.0), (0.0, 2.0), (1.0, 2.0), (2.0, 1.0),
                          (2.0, 0.0), (1.0, -1.0), (0.0, -1.0), (-1.0, 0.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])

    def test_polygon_single_sided(self):
        g = geometry.Polygon([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]])
        h = g.buffer(1, resolution=1, single_sided=True)
        self.assertEqual(h.geom_type, 'Polygon')
        expected_coord = [(-1.0, 0.0), (-1.0, 1.0), (0.0, 2.0), (1.0, 2.0), (2.0, 1.0),
                          (2.0, 0.0), (1.0, -1.0), (0.0, -1.0), (-1.0, 0.0)]
        for index, coord in enumerate(h.exterior.coords):
            self.assertAlmostEqual(coord[0], expected_coord[index][0])
            self.assertAlmostEqual(coord[1], expected_coord[index][1])
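Each test above repeats the same pairwise coordinate-comparison loop. A small stdlib-only helper (hypothetical, not part of this file) could factor it out; it needs nothing from Shapely, only sequences of (x, y) pairs:

```python
import math

def assert_coords_almost_equal(actual, expected, tol=1e-7):
    """Compare two (x, y) coordinate sequences pairwise within a tolerance."""
    actual = list(actual)
    assert len(actual) == len(expected)
    for (ax, ay), (ex, ey) in zip(actual, expected):
        assert math.isclose(ax, ex, abs_tol=tol)
        assert math.isclose(ay, ey, abs_tol=tol)
```

Each test body would then shrink to a single `assert_coords_almost_equal(h.exterior.coords, expected_coord)` call.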
#!/usr/bin/env python
# Source: bdeetz/pynos, pynos/versions/ver_7/ver_7_1_0/yang/brocade_dai.py (Apache-2.0)
import xml.etree.ElementTree as ET


class brocade_dai(object):
    """Auto generated class.
    """
    def __init__(self, **kwargs):
        self._callback = kwargs.pop('callback')

    def arp_access_list_acl_name(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name = ET.SubElement(access_list, "acl-name")
        acl_name.text = kwargs.pop('acl_name')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def arp_access_list_permit_permit_list_ip_type(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name_key = ET.SubElement(access_list, "acl-name")
        acl_name_key.text = kwargs.pop('acl_name')
        permit = ET.SubElement(access_list, "permit")
        permit_list = ET.SubElement(permit, "permit-list")
        host_ip_key = ET.SubElement(permit_list, "host-ip")
        host_ip_key.text = kwargs.pop('host_ip')
        mac_type_key = ET.SubElement(permit_list, "mac-type")
        mac_type_key.text = kwargs.pop('mac_type')
        host_mac_key = ET.SubElement(permit_list, "host-mac")
        host_mac_key.text = kwargs.pop('host_mac')
        ip_type = ET.SubElement(permit_list, "ip-type")
        ip_type.text = kwargs.pop('ip_type')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def arp_access_list_permit_permit_list_host_ip(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name_key = ET.SubElement(access_list, "acl-name")
        acl_name_key.text = kwargs.pop('acl_name')
        permit = ET.SubElement(access_list, "permit")
        permit_list = ET.SubElement(permit, "permit-list")
        ip_type_key = ET.SubElement(permit_list, "ip-type")
        ip_type_key.text = kwargs.pop('ip_type')
        mac_type_key = ET.SubElement(permit_list, "mac-type")
        mac_type_key.text = kwargs.pop('mac_type')
        host_mac_key = ET.SubElement(permit_list, "host-mac")
        host_mac_key.text = kwargs.pop('host_mac')
        host_ip = ET.SubElement(permit_list, "host-ip")
        host_ip.text = kwargs.pop('host_ip')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def arp_access_list_permit_permit_list_mac_type(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name_key = ET.SubElement(access_list, "acl-name")
        acl_name_key.text = kwargs.pop('acl_name')
        permit = ET.SubElement(access_list, "permit")
        permit_list = ET.SubElement(permit, "permit-list")
        ip_type_key = ET.SubElement(permit_list, "ip-type")
        ip_type_key.text = kwargs.pop('ip_type')
        host_ip_key = ET.SubElement(permit_list, "host-ip")
        host_ip_key.text = kwargs.pop('host_ip')
        host_mac_key = ET.SubElement(permit_list, "host-mac")
        host_mac_key.text = kwargs.pop('host_mac')
        mac_type = ET.SubElement(permit_list, "mac-type")
        mac_type.text = kwargs.pop('mac_type')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def arp_access_list_permit_permit_list_host_mac(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name_key = ET.SubElement(access_list, "acl-name")
        acl_name_key.text = kwargs.pop('acl_name')
        permit = ET.SubElement(access_list, "permit")
        permit_list = ET.SubElement(permit, "permit-list")
        ip_type_key = ET.SubElement(permit_list, "ip-type")
        ip_type_key.text = kwargs.pop('ip_type')
        host_ip_key = ET.SubElement(permit_list, "host-ip")
        host_ip_key.text = kwargs.pop('host_ip')
        mac_type_key = ET.SubElement(permit_list, "mac-type")
        mac_type_key.text = kwargs.pop('mac_type')
        host_mac = ET.SubElement(permit_list, "host-mac")
        host_mac.text = kwargs.pop('host_mac')

        callback = kwargs.pop('callback', self._callback)
        return callback(config)

    def arp_access_list_permit_permit_list_log(self, **kwargs):
        """Auto Generated Code
        """
        config = ET.Element("config")
        arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
        access_list = ET.SubElement(arp, "access-list")
        acl_name_key = ET.SubElement(access_list, "acl-name")
        acl_name_key.text = kwargs.pop('acl_name')
        permit = ET.SubElement(access_list, "permit")
        permit_list = ET.SubElement(permit, "permit-list")
        ip_type_key = ET.SubElement(permit_list, "ip-type")
        ip_type_key.text = kwargs.pop('ip_type')
        host_ip_key = ET.SubElement(permit_list, "host-ip")
        host_ip_key.text = kwargs.pop('host_ip')
        mac_type_key = ET.SubElement(permit_list, "mac-type")
        mac_type_key.text = kwargs.pop('mac_type')
        host_mac_key = ET.SubElement(permit_list, "host-mac")
        host_mac_key.text = kwargs.pop('host_mac')
        log = ET.SubElement(permit_list, "log")

        callback = kwargs.pop('callback', self._callback)
        return callback(config)
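Every generator method above builds the same nested `config/arp/access-list` ElementTree and hands it to a callback (normally a NETCONF edit-config sender). A standalone sketch with a serializing stand-in callback (hypothetical, not part of pynos) shows the XML one of these methods produces:

```python
import xml.etree.ElementTree as ET

def serialize(config):
    # Stand-in for the NETCONF callback: return the XML text instead of sending it.
    return ET.tostring(config, encoding="unicode")

def arp_access_list_acl_name(acl_name, callback=serialize):
    # Same tree shape as brocade_dai.arp_access_list_acl_name above.
    config = ET.Element("config")
    arp = ET.SubElement(config, "arp", xmlns="urn:brocade.com:mgmt:brocade-dai")
    access_list = ET.SubElement(arp, "access-list")
    name = ET.SubElement(access_list, "acl-name")
    name.text = acl_name
    return callback(config)

print(arp_access_list_acl_name("ACL1"))
```

Swapping the callback is exactly how the real class works: `self._callback` is injected at construction time, so the same builder methods can serialize, log, or ship the config to a device.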
# Source: TGITS/programming-workouts, erri/python/lesson_51/divisibility_test.py (MIT)
import divisibility
def test_divisible_numbers_by_5_and_7_between_1_10():
    numbers = divisibility.numbers_divisible_by_5_and_7_between_values(0, 10)
    print(numbers)
    assert 1 == len(numbers)
    assert [0] == numbers


def test_divisible_numbers_by_5_and_7_between_1_35():
    numbers = divisibility.numbers_divisible_by_5_and_7_between_values(0, 35)
    print(numbers)
    assert 2 == len(numbers)
    assert [0, 35] == numbers


def test_divisible_numbers_by_5_and_7_between_300_450():
    numbers = divisibility.numbers_divisible_by_5_and_7_between_values(300, 450)
    assert 4 == len(numbers)
    assert [315, 350, 385, 420] == numbers


def test_divisible_numbers_by_5_and_13_between_1_10():
    numbers = divisibility.numbers_divisible_by_5_and_13_between_values(0, 10)
    assert 1 == len(numbers)
    assert [0] == numbers


def test_divisible_numbers_by_5_and_13_between_1_85():
    numbers = divisibility.numbers_divisible_by_5_and_13_between_values(0, 85)
    assert 2 == len(numbers)
    assert [0, 65] == numbers


def test_divisible_numbers_by_5_and_13_between_300_450():
    numbers = divisibility.numbers_divisible_by_5_and_13_between_values(300, 450)
    assert 2 == len(numbers)
    assert [325, 390] == numbers
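The `divisibility` module under test is not shown here. An implementation consistent with these assertions (bounds inclusive on both ends, since `(0, 35)` yields `[0, 35]`) might look like this sketch; the function names match the tests, the bodies are an assumption:

```python
def numbers_divisible_by_5_and_7_between_values(start, end):
    # Inclusive range: the tests expect both endpoints when they qualify.
    return [n for n in range(start, end + 1) if n % 5 == 0 and n % 7 == 0]

def numbers_divisible_by_5_and_13_between_values(start, end):
    return [n for n in range(start, end + 1) if n % 5 == 0 and n % 13 == 0]
```

A number divisible by both 5 and 7 is a multiple of 35 (their least common multiple), which is why `(300, 450)` yields exactly `[315, 350, 385, 420]`.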
# Source: CiscoDevNet/ydk-py, cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_subscriber_pppoe_ma_oper.py (ECL-2.0/Apache-2.0)
""" Cisco_IOS_XR_subscriber_pppoe_ma_oper
This module contains a collection of YANG definitions
for Cisco IOS\-XR subscriber\-pppoe\-ma package operational data.
This module contains definitions
for the following management objects\:
pppoe\: PPPoE operational data
Copyright (c) 2013\-2018 by Cisco Systems, Inc.
All rights reserved.
"""
import sys
from collections import OrderedDict
from ydk.types import Entity as _Entity_
from ydk.types import EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class PppoeMaLimitState(Enum):
    """
    PppoeMaLimitState (Enum Class)

    Pppoe ma limit state

    .. data:: ok = 0

        OK State

    .. data:: warning = 1

        Warn State

    .. data:: block = 2

        Block State

    """

    ok = Enum.YLeaf(0, "ok")

    warning = Enum.YLeaf(1, "warning")

    block = Enum.YLeaf(2, "block")


    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
        return meta._meta_table['PppoeMaLimitState']
class PppoeMaSessionIdbSrgState(Enum):
    """
    PppoeMaSessionIdbSrgState (Enum Class)

    Pppoe ma session idb srg state

    .. data:: none = 0

        SRG-None state

    .. data:: active = 1

        SRG-Active state

    .. data:: standby = 2

        SRG-Standby state

    """

    none = Enum.YLeaf(0, "none")

    active = Enum.YLeaf(1, "active")

    standby = Enum.YLeaf(2, "standby")


    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
        return meta._meta_table['PppoeMaSessionIdbSrgState']
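Each `Enum.YLeaf` in these generated classes pairs an integer value with the YANG identifier it serializes to (e.g. `standby` is 2, serialized as "standby"). Using only the standard library, the same value-to-name mapping can be sketched like this (the class below is an analogy, not YDK's actual implementation):

```python
import enum

class SessionIdbSrgState(enum.Enum):
    # (numeric value, YANG name) pairs mirroring PppoeMaSessionIdbSrgState above.
    none = (0, "none")
    active = (1, "active")
    standby = (2, "standby")

    @property
    def yang_name(self):
        # The string a YANG encoder would emit for this member.
        return self.value[1]

print(SessionIdbSrgState.standby.yang_name)
```

YDK's `Enum.YLeaf` additionally carries type metadata used by its encoders, which is why the generated classes also ship a `_meta_info()` hook pointing into the bundle's meta table.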
class PppoeMaSessionState(Enum):
    """
    PppoeMaSessionState (Enum Class)

    Pppoe ma session state

    .. data:: destroying = 0

        Destroying session

    .. data:: deleting = 1

        Deleting interface

    .. data:: initializing = 2

        Initializing

    .. data:: created = 3

        Interface created

    .. data:: stopping = 4

        Stopping AAA session

    .. data:: started = 5

        AAA session started

    .. data:: activated = 6

        SubDB Config activated

    .. data:: complete = 7

        Complete

    """

    destroying = Enum.YLeaf(0, "destroying")

    deleting = Enum.YLeaf(1, "deleting")

    initializing = Enum.YLeaf(2, "initializing")

    created = Enum.YLeaf(3, "created")

    stopping = Enum.YLeaf(4, "stopping")

    started = Enum.YLeaf(5, "started")

    activated = Enum.YLeaf(6, "activated")

    complete = Enum.YLeaf(7, "complete")


    @staticmethod
    def _meta_info():
        from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
        return meta._meta_table['PppoeMaSessionState']
class PppoeMaSessionTrig(Enum):
    """
    PppoeMaSessionTrig (Enum Class)

    Pppoe ma session trig

    .. data:: pppoe_ma_session_trig_error = 0

        pppoe ma session trig error

    .. data:: pppoe_ma_session_trig_publish_encaps_attr_fail = 1

        pppoe ma session trig publish encaps attr fail

    .. data:: pppoe_ma_session_trig_if_create_fail = 2

        pppoe ma session trig if create fail

    .. data:: pppoe_ma_session_trig_iedge_session_start_fail = 3

        pppoe ma session trig iedge session start fail

    .. data:: pppoe_ma_session_trig_iedge_session_update_fail = 4

        pppoe ma session trig iedge session update fail

    .. data:: pppoe_ma_session_trig_sub_db_activate_fail = 5

        pppoe ma session trig sub db activate fail

    .. data:: pppoe_ma_session_trig_in_flight_timeout = 6

        pppoe ma session trig in flight timeout

    .. data:: pppoe_ma_session_trig_down = 7

        pppoe ma session trig down

    .. data:: pppoe_ma_session_trig_parent = 8

        pppoe ma session trig parent

    .. data:: pppoe_ma_session_trig_padt = 9

        pppoe ma session trig padt

    .. data:: pppoe_ma_session_trig_session_pak = 10

        pppoe ma session trig session pak

    .. data:: pppoe_ma_session_trig_final = 11

        pppoe ma session trig final

    .. data:: pppoe_ma_session_trig_no_im_or = 12

        pppoe ma session trig no im or

    .. data:: pppoe_ma_session_trig_restart = 13

        pppoe ma session trig restart

    .. data:: pppoe_ma_session_trig_admissions_config_change = 14

        pppoe ma session trig admissions config change

    .. data:: pppoe_ma_session_trig_iedge_disconnect = 15

        pppoe ma session trig iedge disconnect

    .. data:: pppoe_ma_session_trig_invalid_vlan_tags = 16

        pppoe ma session trig invalid vlan tags

    .. data:: pppoe_ma_session_trig_port_limit_disconnect = 17

        pppoe ma session trig port limit disconnect

    .. data:: pppoe_ma_session_trig_srg_disconnect = 18

        pppoe ma session trig srg disconnect

    .. data:: pppoe_ma_session_trig_srg_sweep = 19

        pppoe ma session trig srg sweep

    .. data:: pppoe_ma_session_trig_renegotiation = 20

        pppoe ma session trig renegotiation

    .. data:: pppoe_ma_session_trig_count = 21

        pppoe ma session trig count

    """

    pppoe_ma_session_trig_error = Enum.YLeaf(0, "pppoe-ma-session-trig-error")

    pppoe_ma_session_trig_publish_encaps_attr_fail = Enum.YLeaf(1, "pppoe-ma-session-trig-publish-encaps-attr-fail")

    pppoe_ma_session_trig_if_create_fail = Enum.YLeaf(2, "pppoe-ma-session-trig-if-create-fail")

    pppoe_ma_session_trig_iedge_session_start_fail = Enum.YLeaf(3, "pppoe-ma-session-trig-iedge-session-start-fail")

    pppoe_ma_session_trig_iedge_session_update_fail = Enum.YLeaf(4, "pppoe-ma-session-trig-iedge-session-update-fail")

    pppoe_ma_session_trig_sub_db_activate_fail = Enum.YLeaf(5, "pppoe-ma-session-trig-sub-db-activate-fail")

    pppoe_ma_session_trig_in_flight_timeout = Enum.YLeaf(6, "pppoe-ma-session-trig-in-flight-timeout")

    pppoe_ma_session_trig_down = Enum.YLeaf(7, "pppoe-ma-session-trig-down")

    pppoe_ma_session_trig_parent = Enum.YLeaf(8, "pppoe-ma-session-trig-parent")

    pppoe_ma_session_trig_padt = Enum.YLeaf(9, "pppoe-ma-session-trig-padt")

    pppoe_ma_session_trig_session_pak = Enum.YLeaf(10, "pppoe-ma-session-trig-session-pak")

    pppoe_ma_session_trig_final = Enum.YLeaf(11, "pppoe-ma-session-trig-final")

    pppoe_ma_session_trig_no_im_or = Enum.YLeaf(12, "pppoe-ma-session-trig-no-im-or")

    pppoe_ma_session_trig_restart = Enum.YLeaf(13, "pppoe-ma-session-trig-restart")

    pppoe_ma_session_trig_admissions_config_change = Enum.YLeaf(14, "pppoe-ma-session-trig-admissions-config-change")

    pppoe_ma_session_trig_iedge_disconnect = Enum.YLeaf(15, "pppoe-ma-session-trig-iedge-disconnect")

    pppoe_ma_session_trig_invalid_vlan_tags = Enum.YLeaf(16, "pppoe-ma-session-trig-invalid-vlan-tags")

    pppoe_ma_session_trig_port_limit_disconnect = Enum.YLeaf(17, "pppoe-ma-session-trig-port-limit-disconnect")

    pppoe_ma_session_trig_srg_disconnect = Enum.YLeaf(18, "pppoe-ma-session-trig-srg-disconnect")
pppoe_ma_session_trig_srg_sweep = Enum.YLeaf(19, "pppoe-ma-session-trig-srg-sweep")
pppoe_ma_session_trig_renegotiation = Enum.YLeaf(20, "pppoe-ma-session-trig-renegotiation")
pppoe_ma_session_trig_count = Enum.YLeaf(21, "pppoe-ma-session-trig-count")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['PppoeMaSessionTrig']
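# Each enum member above pairs a Python identifier with its YANG enum name,
# and the generated names differ only by underscores vs. hyphens. A minimal
# standalone sketch of that naming convention (an illustration, not part of
# the ydk runtime):
#
# ```python
# def yang_enum_name(python_name):
#     """Convert a generated Python enum identifier to its YANG name."""
#     return python_name.replace("_", "-")
#
# # Mirrors e.g. Enum.YLeaf(9, "pppoe-ma-session-trig-padt") above.
# print(yang_enum_name("pppoe_ma_session_trig_padt"))
# ```

```python
def yang_enum_name(python_name):
    """Convert a generated Python enum identifier to its YANG name."""
    return python_name.replace("_", "-")

# Mirrors e.g. Enum.YLeaf(9, "pppoe-ma-session-trig-padt") above.
print(yang_enum_name("pppoe_ma_session_trig_padt"))
```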
class PppoeMaThrottleState(Enum):
"""
PppoeMaThrottleState (Enum Class)
Pppoe ma throttle state
.. data:: idle = 0
Idle State
.. data:: monitor = 1
Monitor State
.. data:: block = 2
Block State
"""
idle = Enum.YLeaf(0, "idle")
monitor = Enum.YLeaf(1, "monitor")
block = Enum.YLeaf(2, "block")
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['PppoeMaThrottleState']
class Pppoe(_Entity_):
"""
PPPoE operational data
.. attribute:: access_interface_statistics
PPPoE access interface statistics information
**type**\: :py:class:`AccessInterfaceStatistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics>`
**config**\: False
.. attribute:: nodes
Per\-node PPPoE operational data
**type**\: :py:class:`Nodes <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe, self).__init__()
self._top_entity = None
self.yang_name = "pppoe"
self.yang_parent_name = "Cisco-IOS-XR-subscriber-pppoe-ma-oper"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("access-interface-statistics", ("access_interface_statistics", Pppoe.AccessInterfaceStatistics)), ("nodes", ("nodes", Pppoe.Nodes))])
self._leafs = OrderedDict()
self.access_interface_statistics = Pppoe.AccessInterfaceStatistics()
self.access_interface_statistics.parent = self
self._children_name_map["access_interface_statistics"] = "access-interface-statistics"
self.nodes = Pppoe.Nodes()
self.nodes.parent = self
self._children_name_map["nodes"] = "nodes"
self._segment_path = lambda: "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe, [], name, value)
class AccessInterfaceStatistics(_Entity_):
"""
PPPoE access interface statistics information
.. attribute:: access_interface_statistic
Statistics information for a PPPoE\-enabled access interface
**type**\: list of :py:class:`AccessInterfaceStatistic <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics, self).__init__()
self.yang_name = "access-interface-statistics"
self.yang_parent_name = "pppoe"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("access-interface-statistic", ("access_interface_statistic", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic))])
self._leafs = OrderedDict()
self.access_interface_statistic = YList(self)
self._segment_path = lambda: "access-interface-statistics"
self._absolute_path = lambda: "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics, [], name, value)
class AccessInterfaceStatistic(_Entity_):
"""
Statistics information for a PPPoE\-enabled
access interface
.. attribute:: interface_name (key)
PPPoE Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: packet_counts
Packet Counts
**type**\: :py:class:`PacketCounts <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic, self).__init__()
self.yang_name = "access-interface-statistic"
self.yang_parent_name = "access-interface-statistics"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['interface_name']
self._child_classes = OrderedDict([("packet-counts", ("packet_counts", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts))])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
])
self.interface_name = None
self.packet_counts = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts()
self.packet_counts.parent = self
self._children_name_map["packet_counts"] = "packet-counts"
self._segment_path = lambda: "access-interface-statistic" + "[interface-name='" + str(self.interface_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe/access-interface-statistics/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic, ['interface_name'], name, value)
class PacketCounts(_Entity_):
"""
Packet Counts
.. attribute:: padi
PADI counts
**type**\: :py:class:`Padi <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi>`
**config**\: False
.. attribute:: pado
PADO counts
**type**\: :py:class:`Pado <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado>`
**config**\: False
.. attribute:: padr
PADR counts
**type**\: :py:class:`Padr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr>`
**config**\: False
.. attribute:: pads_success
PADS Success counts
**type**\: :py:class:`PadsSuccess <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess>`
**config**\: False
.. attribute:: pads_error
PADS Error counts
**type**\: :py:class:`PadsError <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError>`
**config**\: False
.. attribute:: padt
PADT counts
**type**\: :py:class:`Padt <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt>`
**config**\: False
.. attribute:: session_state
Session Stage counts
**type**\: :py:class:`SessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState>`
**config**\: False
.. attribute:: other
Other counts
**type**\: :py:class:`Other <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts, self).__init__()
self.yang_name = "packet-counts"
self.yang_parent_name = "access-interface-statistic"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("padi", ("padi", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi)), ("pado", ("pado", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado)), ("padr", ("padr", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr)), ("pads-success", ("pads_success", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess)), ("pads-error", ("pads_error", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError)), ("padt", ("padt", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt)), ("session-state", ("session_state", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState)), ("other", ("other", Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other))])
self._leafs = OrderedDict()
self.padi = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi()
self.padi.parent = self
self._children_name_map["padi"] = "padi"
self.pado = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado()
self.pado.parent = self
self._children_name_map["pado"] = "pado"
self.padr = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr()
self.padr.parent = self
self._children_name_map["padr"] = "padr"
self.pads_success = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess()
self.pads_success.parent = self
self._children_name_map["pads_success"] = "pads-success"
self.pads_error = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError()
self.pads_error.parent = self
self._children_name_map["pads_error"] = "pads-error"
self.padt = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt()
self.padt.parent = self
self._children_name_map["padt"] = "padt"
self.session_state = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState()
self.session_state.parent = self
self._children_name_map["session_state"] = "session-state"
self.other = Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other()
self.other.parent = self
self._children_name_map["other"] = "other"
self._segment_path = lambda: "packet-counts"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts, [], name, value)
class Padi(_Entity_):
"""
PADI counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi, self).__init__()
self.yang_name = "padi"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padi"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padi']['meta_info']
class Pado(_Entity_):
"""
PADO counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado, self).__init__()
self.yang_name = "pado"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pado"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Pado']['meta_info']
class Padr(_Entity_):
"""
PADR counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr, self).__init__()
self.yang_name = "padr"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padr"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padr']['meta_info']
class PadsSuccess(_Entity_):
"""
PADS Success counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess, self).__init__()
self.yang_name = "pads-success"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pads-success"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsSuccess']['meta_info']
class PadsError(_Entity_):
"""
PADS Error counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError, self).__init__()
self.yang_name = "pads-error"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pads-error"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.PadsError']['meta_info']
class Padt(_Entity_):
"""
PADT counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt, self).__init__()
self.yang_name = "padt"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padt"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Padt']['meta_info']
class SessionState(_Entity_):
"""
Session Stage counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState, self).__init__()
self.yang_name = "session-state"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "session-state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.SessionState']['meta_info']
class Other(_Entity_):
"""
Other counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other, self).__init__()
self.yang_name = "other"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "other"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts.Other']['meta_info']
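# Every PacketCounts child above (PADI, PADO, PADR, PADS success/error, PADT,
# session-state, other) exposes the same sent/received/dropped uint32 triple,
# so aggregate views are straightforward. A minimal sketch using plain dicts
# as stand-ins for the generated classes (the dict layout is an assumption
# for illustration; on a real read, unpopulated leaves are None):

```python
def total_dropped(packet_counts):
    """Sum 'dropped' over all counter groups, treating unread leaves as 0."""
    return sum((group.get("dropped") or 0) for group in packet_counts.values())

counts = {
    "padi": {"sent": 0, "received": 120, "dropped": 3},
    "pado": {"sent": 120, "received": 0, "dropped": 0},
    "padt": {"sent": 5, "received": 7, "dropped": None},  # not yet populated
}
print(total_dropped(counts))  # 3
```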
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic.PacketCounts']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics.AccessInterfaceStatistic']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.AccessInterfaceStatistics']['meta_info']
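# The keyed access-interface-statistic list composes its data path from the
# interface-name key via the generated _segment_path/_absolute_path lambdas
# above. This standalone helper rebuilds the same path string (useful when
# reasoning about filtered reads; no ydk import required):

```python
def access_stat_path(interface_name):
    """Absolute path for one access-interface-statistic list entry."""
    return (
        "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe"
        "/access-interface-statistics"
        "/access-interface-statistic[interface-name='%s']" % interface_name
    )

print(access_stat_path("GigabitEthernet0/0/0/1"))
```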
class Nodes(_Entity_):
"""
Per\-node PPPoE operational data
.. attribute:: node
PPPoE operational data for a particular node
**type**\: list of :py:class:`Node <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes, self).__init__()
self.yang_name = "nodes"
self.yang_parent_name = "pppoe"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_classes = OrderedDict([("node", ("node", Pppoe.Nodes.Node))])
self._leafs = OrderedDict()
self.node = YList(self)
self._segment_path = lambda: "nodes"
self._absolute_path = lambda: "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes, [], name, value)
class Node(_Entity_):
"""
PPPoE operational data for a particular node
.. attribute:: node_name (key)
Node
**type**\: str
**pattern:** ([a\-zA\-Z0\-9\_]\*\\d+/){1,2}([a\-zA\-Z0\-9\_]\*\\d+)
**config**\: False
.. attribute:: disconnect_history
PPPoE disconnect history for a given node
**type**\: :py:class:`DisconnectHistory <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory>`
**config**\: False
.. attribute:: disconnect_history_unique
PPPoE unique disconnect history for a given node
**type**\: :py:class:`DisconnectHistoryUnique <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique>`
**config**\: False
.. attribute:: statistics
PPPoE statistics for a given node
**type**\: :py:class:`Statistics <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics>`
**config**\: False
.. attribute:: access_interface
PPPoE access interface information
**type**\: :py:class:`AccessInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.AccessInterface>`
**config**\: False
.. attribute:: interfaces
Per interface PPPoE operational data
**type**\: :py:class:`Interfaces <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Interfaces>`
**config**\: False
.. attribute:: bba_groups
PPPoE BBA\-Group information
**type**\: :py:class:`BbaGroups <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups>`
**config**\: False
.. attribute:: summary_total
PPPoE statistics for a given node
**type**\: :py:class:`SummaryTotal <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.SummaryTotal>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node, self).__init__()
self.yang_name = "node"
self.yang_parent_name = "nodes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['node_name']
self._child_classes = OrderedDict([("disconnect-history", ("disconnect_history", Pppoe.Nodes.Node.DisconnectHistory)), ("disconnect-history-unique", ("disconnect_history_unique", Pppoe.Nodes.Node.DisconnectHistoryUnique)), ("statistics", ("statistics", Pppoe.Nodes.Node.Statistics)), ("access-interface", ("access_interface", Pppoe.Nodes.Node.AccessInterface)), ("interfaces", ("interfaces", Pppoe.Nodes.Node.Interfaces)), ("bba-groups", ("bba_groups", Pppoe.Nodes.Node.BbaGroups)), ("summary-total", ("summary_total", Pppoe.Nodes.Node.SummaryTotal))])
self._leafs = OrderedDict([
('node_name', (YLeaf(YType.str, 'node-name'), ['str'])),
])
self.node_name = None
self.disconnect_history = Pppoe.Nodes.Node.DisconnectHistory()
self.disconnect_history.parent = self
self._children_name_map["disconnect_history"] = "disconnect-history"
self.disconnect_history_unique = Pppoe.Nodes.Node.DisconnectHistoryUnique()
self.disconnect_history_unique.parent = self
self._children_name_map["disconnect_history_unique"] = "disconnect-history-unique"
self.statistics = Pppoe.Nodes.Node.Statistics()
self.statistics.parent = self
self._children_name_map["statistics"] = "statistics"
self.access_interface = Pppoe.Nodes.Node.AccessInterface()
self.access_interface.parent = self
self._children_name_map["access_interface"] = "access-interface"
self.interfaces = Pppoe.Nodes.Node.Interfaces()
self.interfaces.parent = self
self._children_name_map["interfaces"] = "interfaces"
self.bba_groups = Pppoe.Nodes.Node.BbaGroups()
self.bba_groups.parent = self
self._children_name_map["bba_groups"] = "bba-groups"
self.summary_total = Pppoe.Nodes.Node.SummaryTotal()
self.summary_total.parent = self
self._children_name_map["summary_total"] = "summary-total"
self._segment_path = lambda: "node" + "[node-name='" + str(self.node_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-subscriber-pppoe-ma-oper:pppoe/nodes/%s" % self._segment_path()
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node, ['node_name'], name, value)
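# Per the Node docstring, node-name keys must match the YANG pattern
# ([a-zA-Z0-9_]*\d+/){1,2}([a-zA-Z0-9_]*\d+). A quick standalone check with
# Python's re module (the example node names are illustrative):

```python
import re

# The node-name key pattern from the Node docstring, with rST escaping removed.
NODE_NAME_RE = re.compile(r"([a-zA-Z0-9_]*\d+/){1,2}([a-zA-Z0-9_]*\d+)")

def is_valid_node_name(name):
    """True if the whole string matches the YANG node-name pattern."""
    return NODE_NAME_RE.fullmatch(name) is not None

print(is_valid_node_name("0/RP0/CPU0"))  # True
print(is_valid_node_name("RP0"))         # False: no '/'-separated location parts
```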
class DisconnectHistory(_Entity_):
"""
PPPoE disconnect history for a given node
.. attribute:: current_idx
Current index of history
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: entry
Array of disconnected subscribers
**type**\: list of :py:class:`Entry <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory, self).__init__()
self.yang_name = "disconnect-history"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("entry", ("entry", Pppoe.Nodes.Node.DisconnectHistory.Entry))])
self._leafs = OrderedDict([
('current_idx', (YLeaf(YType.uint32, 'current-idx'), ['int'])),
])
self.current_idx = None
self.entry = YList(self)
self._segment_path = lambda: "disconnect-history"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory, ['current_idx'], name, value)
class Entry(_Entity_):
"""
Array of disconnected subscribers
.. attribute:: session_idb
Session IDB
**type**\: :py:class:`SessionIdb <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb>`
**config**\: False
.. attribute:: timestamp
Time when disconnected
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: ifname
Interface name
**type**\: str
**config**\: False
.. attribute:: trigger
Disconnect Trigger
**type**\: :py:class:`PppoeMaSessionTrig <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionTrig>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry, self).__init__()
self.yang_name = "entry"
self.yang_parent_name = "disconnect-history"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("session-idb", ("session_idb", Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb))])
self._leafs = OrderedDict([
('timestamp', (YLeaf(YType.uint64, 'timestamp'), ['int'])),
('ifname', (YLeaf(YType.str, 'ifname'), ['str'])),
('trigger', (YLeaf(YType.enumeration, 'trigger'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionTrig', '')])),
])
self.timestamp = None
self.ifname = None
self.trigger = None
self.session_idb = Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb()
self.session_idb.parent = self
self._children_name_map["session_idb"] = "session-idb"
self._segment_path = lambda: "entry"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry, ['timestamp', 'ifname', 'trigger'], name, value)
class SessionIdb(_Entity_):
"""
Session IDB
.. attribute:: tags
Tags
**type**\: :py:class:`Tags <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags>`
**config**\: False
.. attribute:: vlan_outer_tag
VLAN Outer Tag
**type**\: :py:class:`VlanOuterTag <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag>`
**config**\: False
.. attribute:: vlan_inner_tag
VLAN Inner Tag
**type**\: :py:class:`VlanInnerTag <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag>`
**config**\: False
.. attribute:: interface
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: access_interface
Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: session_id
Session ID
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: sub_label
Sub Label
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: peer_mac_address
Peer Mac\-Address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: state
State
**type**\: :py:class:`PppoeMaSessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionState>`
**config**\: False
.. attribute:: cdm_object_handle
CDM Object Handle
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: chkpt_id
Chkpt ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: punted_count
Punted Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: port_limit
Port Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: is_counted
Is BBA Counted
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vlan_outer_tag
Is VLAN Outer Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vlan_inner_tag
Is VLAN Inner Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_cleanup_pending
Is Cleanup Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_disconnect_done_pending
Is Disconnect Done Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_delete_done_pending
Is Delete Done Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_create_callback_pending
Is Interface Create Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_publish_encaps_attr_pending
Is Publish Encaps Attr pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_publish_encaps_attr_cb_pending
Is Publish Encaps Attr Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_delete_callback_pending
Is Interface Delete Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_delete_pending
Is Interface Delete pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_owned_resource
Is IM Owned Resource
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_final_received
Is IM Final received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_owned_resource_missing
Is IM Owned Resource missing
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_start_request_callback_pending
Is AAA Start request callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_owned_resource
Is AAA Owned Resource
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_disconnect_requested
Is AAA Disconnect Requested
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_disconnect_received
Is AAA Disconnect Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_sub_db_activate_callback_pending
Is SubDB Activate callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_pads_sent
Is PADS Sent
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_padt_received
Is PADT Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_in_flight
Is Session In Flight
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_radius_override
Is RADIUS override enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: expected_notifications
Expected Notifications
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: received_notifications
Received Notifications
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: srg_state
SRG state
**type**\: :py:class:`PppoeMaSessionIdbSrgState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionIdbSrgState>`
**config**\: False
.. attribute:: is_srg_data_received
Is SRG Data Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_iedge_data_received
Is IEDGE Data Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb, self).__init__()
self.yang_name = "session-idb"
self.yang_parent_name = "entry"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("tags", ("tags", Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags)), ("vlan-outer-tag", ("vlan_outer_tag", Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag)), ("vlan-inner-tag", ("vlan_inner_tag", Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag))])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('access_interface', (YLeaf(YType.str, 'access-interface'), ['str'])),
('session_id', (YLeaf(YType.uint16, 'session-id'), ['int'])),
('sub_label', (YLeaf(YType.uint32, 'sub-label'), ['int'])),
('peer_mac_address', (YLeaf(YType.str, 'peer-mac-address'), ['str'])),
('state', (YLeaf(YType.enumeration, 'state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionState', '')])),
('cdm_object_handle', (YLeaf(YType.uint32, 'cdm-object-handle'), ['int'])),
('chkpt_id', (YLeaf(YType.uint32, 'chkpt-id'), ['int'])),
('punted_count', (YLeaf(YType.uint32, 'punted-count'), ['int'])),
('port_limit', (YLeaf(YType.uint32, 'port-limit'), ['int'])),
('is_counted', (YLeaf(YType.int32, 'is-counted'), ['int'])),
('is_vlan_outer_tag', (YLeaf(YType.int32, 'is-vlan-outer-tag'), ['int'])),
('is_vlan_inner_tag', (YLeaf(YType.int32, 'is-vlan-inner-tag'), ['int'])),
('is_cleanup_pending', (YLeaf(YType.int32, 'is-cleanup-pending'), ['int'])),
('is_disconnect_done_pending', (YLeaf(YType.int32, 'is-disconnect-done-pending'), ['int'])),
('is_delete_done_pending', (YLeaf(YType.int32, 'is-delete-done-pending'), ['int'])),
('is_intf_create_callback_pending', (YLeaf(YType.int32, 'is-intf-create-callback-pending'), ['int'])),
('is_publish_encaps_attr_pending', (YLeaf(YType.int32, 'is-publish-encaps-attr-pending'), ['int'])),
('is_publish_encaps_attr_cb_pending', (YLeaf(YType.int32, 'is-publish-encaps-attr-cb-pending'), ['int'])),
('is_intf_delete_callback_pending', (YLeaf(YType.int32, 'is-intf-delete-callback-pending'), ['int'])),
('is_intf_delete_pending', (YLeaf(YType.int32, 'is-intf-delete-pending'), ['int'])),
('is_im_owned_resource', (YLeaf(YType.int32, 'is-im-owned-resource'), ['int'])),
('is_im_final_received', (YLeaf(YType.int32, 'is-im-final-received'), ['int'])),
('is_im_owned_resource_missing', (YLeaf(YType.int32, 'is-im-owned-resource-missing'), ['int'])),
('is_aaa_start_request_callback_pending', (YLeaf(YType.int32, 'is-aaa-start-request-callback-pending'), ['int'])),
('is_aaa_owned_resource', (YLeaf(YType.int32, 'is-aaa-owned-resource'), ['int'])),
('is_aaa_disconnect_requested', (YLeaf(YType.int32, 'is-aaa-disconnect-requested'), ['int'])),
('is_aaa_disconnect_received', (YLeaf(YType.int32, 'is-aaa-disconnect-received'), ['int'])),
('is_sub_db_activate_callback_pending', (YLeaf(YType.int32, 'is-sub-db-activate-callback-pending'), ['int'])),
('is_pads_sent', (YLeaf(YType.int32, 'is-pads-sent'), ['int'])),
('is_padt_received', (YLeaf(YType.int32, 'is-padt-received'), ['int'])),
('is_in_flight', (YLeaf(YType.int32, 'is-in-flight'), ['int'])),
('is_radius_override', (YLeaf(YType.int32, 'is-radius-override'), ['int'])),
('expected_notifications', (YLeaf(YType.uint8, 'expected-notifications'), ['int'])),
('received_notifications', (YLeaf(YType.uint8, 'received-notifications'), ['int'])),
('srg_state', (YLeaf(YType.enumeration, 'srg-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionIdbSrgState', '')])),
('is_srg_data_received', (YLeaf(YType.int32, 'is-srg-data-received'), ['int'])),
('is_iedge_data_received', (YLeaf(YType.int32, 'is-iedge-data-received'), ['int'])),
])
self.interface = None
self.access_interface = None
self.session_id = None
self.sub_label = None
self.peer_mac_address = None
self.state = None
self.cdm_object_handle = None
self.chkpt_id = None
self.punted_count = None
self.port_limit = None
self.is_counted = None
self.is_vlan_outer_tag = None
self.is_vlan_inner_tag = None
self.is_cleanup_pending = None
self.is_disconnect_done_pending = None
self.is_delete_done_pending = None
self.is_intf_create_callback_pending = None
self.is_publish_encaps_attr_pending = None
self.is_publish_encaps_attr_cb_pending = None
self.is_intf_delete_callback_pending = None
self.is_intf_delete_pending = None
self.is_im_owned_resource = None
self.is_im_final_received = None
self.is_im_owned_resource_missing = None
self.is_aaa_start_request_callback_pending = None
self.is_aaa_owned_resource = None
self.is_aaa_disconnect_requested = None
self.is_aaa_disconnect_received = None
self.is_sub_db_activate_callback_pending = None
self.is_pads_sent = None
self.is_padt_received = None
self.is_in_flight = None
self.is_radius_override = None
self.expected_notifications = None
self.received_notifications = None
self.srg_state = None
self.is_srg_data_received = None
self.is_iedge_data_received = None
self.tags = Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags()
self.tags.parent = self
self._children_name_map["tags"] = "tags"
self.vlan_outer_tag = Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag()
self.vlan_outer_tag.parent = self
self._children_name_map["vlan_outer_tag"] = "vlan-outer-tag"
self.vlan_inner_tag = Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag()
self.vlan_inner_tag.parent = self
self._children_name_map["vlan_inner_tag"] = "vlan-inner-tag"
self._segment_path = lambda: "session-idb"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb, ['interface', 'access_interface', 'session_id', 'sub_label', 'peer_mac_address', 'state', 'cdm_object_handle', 'chkpt_id', 'punted_count', 'port_limit', 'is_counted', 'is_vlan_outer_tag', 'is_vlan_inner_tag', 'is_cleanup_pending', 'is_disconnect_done_pending', 'is_delete_done_pending', 'is_intf_create_callback_pending', 'is_publish_encaps_attr_pending', 'is_publish_encaps_attr_cb_pending', 'is_intf_delete_callback_pending', 'is_intf_delete_pending', 'is_im_owned_resource', 'is_im_final_received', 'is_im_owned_resource_missing', 'is_aaa_start_request_callback_pending', 'is_aaa_owned_resource', 'is_aaa_disconnect_requested', 'is_aaa_disconnect_received', 'is_sub_db_activate_callback_pending', 'is_pads_sent', 'is_padt_received', 'is_in_flight', 'is_radius_override', 'expected_notifications', 'received_notifications', 'srg_state', 'is_srg_data_received', 'is_iedge_data_received'], name, value)
class Tags(_Entity_):
"""
Tags
.. attribute:: access_loop_encapsulation
Access Loop Encapsulation
**type**\: :py:class:`AccessLoopEncapsulation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation>`
**config**\: False
.. attribute:: is_service_name
Is Service Name
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_max_payload
Is Max Payload
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_host_uniq
Is Host Uniq
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_relay_session_id
Is Relay Session ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vendor_specific
Is Vendor Specific
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_iwf
Is IWF
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_remote_id
Is Remote ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_circuit_id
Is Circuit ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_tag
Is DSL Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: service_name
Service Name
**type**\: str
**config**\: False
.. attribute:: max_payload
Max Payload
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: host_uniq
Host Uniq
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: relay_session_id
Relay Session ID
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: str
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: str
**config**\: False
.. attribute:: is_dsl_actual_up
Is DSL Actual Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_down
Is DSL Actual Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_up
Is DSL Min Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_down
Is DSL Min Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_attain_up
Is DSL Attain Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_attain_down
Is DSL Attain Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_up
Is DSL Max Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_down
Is DSL Max Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_up_low
Is DSL Min Up Low
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_down_low
Is DSL Min Down Low
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_delay_up
Is DSL Max Delay Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_delay_up
Is DSL Actual Delay Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_delay_down
Is DSL Max Delay Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_delay_down
Is DSL Actual Delay Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_access_loop_encapsulation
Is Access Loop Encapsulation
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: dsl_actual_up
DSL Actual Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_down
DSL Actual Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up
DSL Min Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down
DSL Min Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_up
DSL Attain Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_down
DSL Attain Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_up
DSL Max Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_down
DSL Max Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up_low
DSL Min Up Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down_low
DSL Min Down Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_up
DSL Max Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_up
DSL Actual Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_down
DSL Max Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_down
DSL Actual Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags, self).__init__()
self.yang_name = "tags"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("access-loop-encapsulation", ("access_loop_encapsulation", Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation))])
self._leafs = OrderedDict([
('is_service_name', (YLeaf(YType.int32, 'is-service-name'), ['int'])),
('is_max_payload', (YLeaf(YType.int32, 'is-max-payload'), ['int'])),
('is_host_uniq', (YLeaf(YType.int32, 'is-host-uniq'), ['int'])),
('is_relay_session_id', (YLeaf(YType.int32, 'is-relay-session-id'), ['int'])),
('is_vendor_specific', (YLeaf(YType.int32, 'is-vendor-specific'), ['int'])),
('is_iwf', (YLeaf(YType.int32, 'is-iwf'), ['int'])),
('is_remote_id', (YLeaf(YType.int32, 'is-remote-id'), ['int'])),
('is_circuit_id', (YLeaf(YType.int32, 'is-circuit-id'), ['int'])),
('is_dsl_tag', (YLeaf(YType.int32, 'is-dsl-tag'), ['int'])),
('service_name', (YLeaf(YType.str, 'service-name'), ['str'])),
('max_payload', (YLeaf(YType.uint32, 'max-payload'), ['int'])),
('host_uniq', (YLeaf(YType.str, 'host-uniq'), ['str'])),
('relay_session_id', (YLeaf(YType.str, 'relay-session-id'), ['str'])),
('remote_id', (YLeaf(YType.str, 'remote-id'), ['str'])),
('circuit_id', (YLeaf(YType.str, 'circuit-id'), ['str'])),
('is_dsl_actual_up', (YLeaf(YType.int32, 'is-dsl-actual-up'), ['int'])),
('is_dsl_actual_down', (YLeaf(YType.int32, 'is-dsl-actual-down'), ['int'])),
('is_dsl_min_up', (YLeaf(YType.int32, 'is-dsl-min-up'), ['int'])),
('is_dsl_min_down', (YLeaf(YType.int32, 'is-dsl-min-down'), ['int'])),
('is_dsl_attain_up', (YLeaf(YType.int32, 'is-dsl-attain-up'), ['int'])),
('is_dsl_attain_down', (YLeaf(YType.int32, 'is-dsl-attain-down'), ['int'])),
('is_dsl_max_up', (YLeaf(YType.int32, 'is-dsl-max-up'), ['int'])),
('is_dsl_max_down', (YLeaf(YType.int32, 'is-dsl-max-down'), ['int'])),
('is_dsl_min_up_low', (YLeaf(YType.int32, 'is-dsl-min-up-low'), ['int'])),
('is_dsl_min_down_low', (YLeaf(YType.int32, 'is-dsl-min-down-low'), ['int'])),
('is_dsl_max_delay_up', (YLeaf(YType.int32, 'is-dsl-max-delay-up'), ['int'])),
('is_dsl_actual_delay_up', (YLeaf(YType.int32, 'is-dsl-actual-delay-up'), ['int'])),
('is_dsl_max_delay_down', (YLeaf(YType.int32, 'is-dsl-max-delay-down'), ['int'])),
('is_dsl_actual_delay_down', (YLeaf(YType.int32, 'is-dsl-actual-delay-down'), ['int'])),
('is_access_loop_encapsulation', (YLeaf(YType.int32, 'is-access-loop-encapsulation'), ['int'])),
('dsl_actual_up', (YLeaf(YType.uint32, 'dsl-actual-up'), ['int'])),
('dsl_actual_down', (YLeaf(YType.uint32, 'dsl-actual-down'), ['int'])),
('dsl_min_up', (YLeaf(YType.uint32, 'dsl-min-up'), ['int'])),
('dsl_min_down', (YLeaf(YType.uint32, 'dsl-min-down'), ['int'])),
('dsl_attain_up', (YLeaf(YType.uint32, 'dsl-attain-up'), ['int'])),
('dsl_attain_down', (YLeaf(YType.uint32, 'dsl-attain-down'), ['int'])),
('dsl_max_up', (YLeaf(YType.uint32, 'dsl-max-up'), ['int'])),
('dsl_max_down', (YLeaf(YType.uint32, 'dsl-max-down'), ['int'])),
('dsl_min_up_low', (YLeaf(YType.uint32, 'dsl-min-up-low'), ['int'])),
('dsl_min_down_low', (YLeaf(YType.uint32, 'dsl-min-down-low'), ['int'])),
('dsl_max_delay_up', (YLeaf(YType.uint32, 'dsl-max-delay-up'), ['int'])),
('dsl_actual_delay_up', (YLeaf(YType.uint32, 'dsl-actual-delay-up'), ['int'])),
('dsl_max_delay_down', (YLeaf(YType.uint32, 'dsl-max-delay-down'), ['int'])),
('dsl_actual_delay_down', (YLeaf(YType.uint32, 'dsl-actual-delay-down'), ['int'])),
])
self.is_service_name = None
self.is_max_payload = None
self.is_host_uniq = None
self.is_relay_session_id = None
self.is_vendor_specific = None
self.is_iwf = None
self.is_remote_id = None
self.is_circuit_id = None
self.is_dsl_tag = None
self.service_name = None
self.max_payload = None
self.host_uniq = None
self.relay_session_id = None
self.remote_id = None
self.circuit_id = None
self.is_dsl_actual_up = None
self.is_dsl_actual_down = None
self.is_dsl_min_up = None
self.is_dsl_min_down = None
self.is_dsl_attain_up = None
self.is_dsl_attain_down = None
self.is_dsl_max_up = None
self.is_dsl_max_down = None
self.is_dsl_min_up_low = None
self.is_dsl_min_down_low = None
self.is_dsl_max_delay_up = None
self.is_dsl_actual_delay_up = None
self.is_dsl_max_delay_down = None
self.is_dsl_actual_delay_down = None
self.is_access_loop_encapsulation = None
self.dsl_actual_up = None
self.dsl_actual_down = None
self.dsl_min_up = None
self.dsl_min_down = None
self.dsl_attain_up = None
self.dsl_attain_down = None
self.dsl_max_up = None
self.dsl_max_down = None
self.dsl_min_up_low = None
self.dsl_min_down_low = None
self.dsl_max_delay_up = None
self.dsl_actual_delay_up = None
self.dsl_max_delay_down = None
self.dsl_actual_delay_down = None
self.access_loop_encapsulation = Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation()
self.access_loop_encapsulation.parent = self
self._children_name_map["access_loop_encapsulation"] = "access-loop-encapsulation"
self._segment_path = lambda: "tags"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags, ['is_service_name', 'is_max_payload', 'is_host_uniq', 'is_relay_session_id', 'is_vendor_specific', 'is_iwf', 'is_remote_id', 'is_circuit_id', 'is_dsl_tag', 'service_name', 'max_payload', 'host_uniq', 'relay_session_id', 'remote_id', 'circuit_id', 'is_dsl_actual_up', 'is_dsl_actual_down', 'is_dsl_min_up', 'is_dsl_min_down', 'is_dsl_attain_up', 'is_dsl_attain_down', 'is_dsl_max_up', 'is_dsl_max_down', 'is_dsl_min_up_low', 'is_dsl_min_down_low', 'is_dsl_max_delay_up', 'is_dsl_actual_delay_up', 'is_dsl_max_delay_down', 'is_dsl_actual_delay_down', 'is_access_loop_encapsulation', 'dsl_actual_up', 'dsl_actual_down', 'dsl_min_up', 'dsl_min_down', 'dsl_attain_up', 'dsl_attain_down', 'dsl_max_up', 'dsl_max_down', 'dsl_min_up_low', 'dsl_min_down_low', 'dsl_max_delay_up', 'dsl_actual_delay_up', 'dsl_max_delay_down', 'dsl_actual_delay_down'], name, value)
class AccessLoopEncapsulation(_Entity_):
"""
Access Loop Encapsulation
.. attribute:: data_link
Data Link
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps1
Encaps 1
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps2
Encaps 2
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation, self).__init__()
self.yang_name = "access-loop-encapsulation"
self.yang_parent_name = "tags"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('data_link', (YLeaf(YType.uint8, 'data-link'), ['int'])),
('encaps1', (YLeaf(YType.uint8, 'encaps1'), ['int'])),
('encaps2', (YLeaf(YType.uint8, 'encaps2'), ['int'])),
])
self.data_link = None
self.encaps1 = None
self.encaps2 = None
self._segment_path = lambda: "access-loop-encapsulation"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation, ['data_link', 'encaps1', 'encaps2'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags.AccessLoopEncapsulation']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.Tags']['meta_info']
class VlanOuterTag(_Entity_):
"""
VLAN Outer Tag
.. attribute:: ether_type
Ethertype. See IEEE 802.1Q for more information
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: user_priority
User Priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: cfi
CFI
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag, self).__init__()
self.yang_name = "vlan-outer-tag"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ether_type', (YLeaf(YType.uint16, 'ether-type'), ['int'])),
('user_priority', (YLeaf(YType.uint8, 'user-priority'), ['int'])),
('cfi', (YLeaf(YType.uint8, 'cfi'), ['int'])),
('vlan_id', (YLeaf(YType.uint16, 'vlan-id'), ['int'])),
])
self.ether_type = None
self.user_priority = None
self.cfi = None
self.vlan_id = None
self._segment_path = lambda: "vlan-outer-tag"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag, ['ether_type', 'user_priority', 'cfi', 'vlan_id'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanOuterTag']['meta_info']
class VlanInnerTag(_Entity_):
"""
VLAN Inner Tag
.. attribute:: ether_type
Ethertype. See IEEE 802.1Q for more information
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: user_priority
User Priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: cfi
CFI
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag, self).__init__()
self.yang_name = "vlan-inner-tag"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ether_type', (YLeaf(YType.uint16, 'ether-type'), ['int'])),
('user_priority', (YLeaf(YType.uint8, 'user-priority'), ['int'])),
('cfi', (YLeaf(YType.uint8, 'cfi'), ['int'])),
('vlan_id', (YLeaf(YType.uint16, 'vlan-id'), ['int'])),
])
self.ether_type = None
self.user_priority = None
self.cfi = None
self.vlan_id = None
self._segment_path = lambda: "vlan-inner-tag"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag, ['ether_type', 'user_priority', 'cfi', 'vlan_id'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb.VlanInnerTag']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry.SessionIdb']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory.Entry']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistory']['meta_info']
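# Usage sketch (not part of the generated bindings): DisconnectHistory and its
# siblings are read-only operational data, normally retrieved with CRUDService
# over a NETCONF session. The device address and credentials below are
# placeholders for illustration only.
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#     from ydk.models.cisco_ios_xr import Cisco_IOS_XR_subscriber_pppoe_ma_oper as pppoe_oper
#
#     provider = NetconfServiceProvider(address="192.0.2.1",
#                                       username="admin", password="admin")
#     crud = CRUDService()
#     # Read the whole Pppoe operational tree, then walk per-node history.
#     pppoe = crud.read(provider, pppoe_oper.Pppoe())
#     for node in pppoe.nodes.node:
#         for entry in node.disconnect_history.entry:
#             print(entry.ifname, entry.timestamp, entry.trigger)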
class DisconnectHistoryUnique(_Entity_):
"""
PPPoE unique disconnect history for a given
node
.. attribute:: disconnect_count
The total number of disconnects
**type**\: list of int
**range:** 0..4294967295
**config**\: False
.. attribute:: entry
Array of disconnected subscribers
**type**\: list of :py:class:`Entry <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique, self).__init__()
self.yang_name = "disconnect-history-unique"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("entry", ("entry", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry))])
self._leafs = OrderedDict([
('disconnect_count', (YLeafList(YType.uint32, 'disconnect-count'), ['int'])),
])
self.disconnect_count = []
self.entry = YList(self)
self._segment_path = lambda: "disconnect-history-unique"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique, ['disconnect_count'], name, value)
class Entry(_Entity_):
"""
Array of disconnected subscribers
.. attribute:: session_idb
Session IDB
**type**\: :py:class:`SessionIdb <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb>`
**config**\: False
.. attribute:: timestamp
Time when disconnected
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: ifname
Interface name
**type**\: str
**config**\: False
.. attribute:: trigger
Disconnect Trigger
**type**\: :py:class:`PppoeMaSessionTrig <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionTrig>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry, self).__init__()
self.yang_name = "entry"
self.yang_parent_name = "disconnect-history-unique"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("session-idb", ("session_idb", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb))])
self._leafs = OrderedDict([
('timestamp', (YLeaf(YType.uint64, 'timestamp'), ['int'])),
('ifname', (YLeaf(YType.str, 'ifname'), ['str'])),
('trigger', (YLeaf(YType.enumeration, 'trigger'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionTrig', '')])),
])
self.timestamp = None
self.ifname = None
self.trigger = None
self.session_idb = Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb()
self.session_idb.parent = self
self._children_name_map["session_idb"] = "session-idb"
self._segment_path = lambda: "entry"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry, ['timestamp', 'ifname', 'trigger'], name, value)
class SessionIdb(_Entity_):
"""
Session IDB
.. attribute:: tags
Tags
**type**\: :py:class:`Tags <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags>`
**config**\: False
.. attribute:: vlan_outer_tag
VLAN Outer Tag
**type**\: :py:class:`VlanOuterTag <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag>`
**config**\: False
.. attribute:: vlan_inner_tag
VLAN Inner Tag
**type**\: :py:class:`VlanInnerTag <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag>`
**config**\: False
.. attribute:: interface
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: access_interface
Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: session_id
Session ID
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: sub_label
Sub Label
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: peer_mac_address
Peer Mac\-Address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: state
State
**type**\: :py:class:`PppoeMaSessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionState>`
**config**\: False
.. attribute:: cdm_object_handle
CDM Object Handle
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: chkpt_id
Chkpt ID
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: punted_count
Punted Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: port_limit
Port Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: is_counted
Is BBA Counted
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vlan_outer_tag
Is VLAN Outer Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vlan_inner_tag
Is VLAN Inner Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_cleanup_pending
Is Cleanup Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_disconnect_done_pending
Is Disconnect Done Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_delete_done_pending
Is Delete Done Pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_create_callback_pending
Is Interface Create Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_publish_encaps_attr_pending
Is Publish Encaps Attr pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_publish_encaps_attr_cb_pending
Is Publish Encaps Attr Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_delete_callback_pending
Is Interface Delete Callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_intf_delete_pending
Is Interface Delete pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_owned_resource
Is IM Owned Resource
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_final_received
Is IM Final received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_im_owned_resource_missing
Is IM Owned Resource missing
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_start_request_callback_pending
Is AAA Start request callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_owned_resource
Is AAA Owned Resource
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_disconnect_requested
Is AAA Disconnect Requested
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_aaa_disconnect_received
Is AAA Disconnect Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_sub_db_activate_callback_pending
Is SubDB Activate callback pending
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_pads_sent
Is PADS Sent
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_padt_received
Is PADT Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_in_flight
Is Session In Flight
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_radius_override
Is RADIUS override enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: expected_notifications
Expected Notifications
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: received_notifications
Received Notifications
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: srg_state
SRG state
**type**\: :py:class:`PppoeMaSessionIdbSrgState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionIdbSrgState>`
**config**\: False
.. attribute:: is_srg_data_received
Is SRG Data Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_iedge_data_received
Is IEDGE Data Received
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb, self).__init__()
self.yang_name = "session-idb"
self.yang_parent_name = "entry"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("tags", ("tags", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags)), ("vlan-outer-tag", ("vlan_outer_tag", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag)), ("vlan-inner-tag", ("vlan_inner_tag", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag))])
self._leafs = OrderedDict([
('interface', (YLeaf(YType.str, 'interface'), ['str'])),
('access_interface', (YLeaf(YType.str, 'access-interface'), ['str'])),
('session_id', (YLeaf(YType.uint16, 'session-id'), ['int'])),
('sub_label', (YLeaf(YType.uint32, 'sub-label'), ['int'])),
('peer_mac_address', (YLeaf(YType.str, 'peer-mac-address'), ['str'])),
('state', (YLeaf(YType.enumeration, 'state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionState', '')])),
('cdm_object_handle', (YLeaf(YType.uint32, 'cdm-object-handle'), ['int'])),
('chkpt_id', (YLeaf(YType.uint32, 'chkpt-id'), ['int'])),
('punted_count', (YLeaf(YType.uint32, 'punted-count'), ['int'])),
('port_limit', (YLeaf(YType.uint32, 'port-limit'), ['int'])),
('is_counted', (YLeaf(YType.int32, 'is-counted'), ['int'])),
('is_vlan_outer_tag', (YLeaf(YType.int32, 'is-vlan-outer-tag'), ['int'])),
('is_vlan_inner_tag', (YLeaf(YType.int32, 'is-vlan-inner-tag'), ['int'])),
('is_cleanup_pending', (YLeaf(YType.int32, 'is-cleanup-pending'), ['int'])),
('is_disconnect_done_pending', (YLeaf(YType.int32, 'is-disconnect-done-pending'), ['int'])),
('is_delete_done_pending', (YLeaf(YType.int32, 'is-delete-done-pending'), ['int'])),
('is_intf_create_callback_pending', (YLeaf(YType.int32, 'is-intf-create-callback-pending'), ['int'])),
('is_publish_encaps_attr_pending', (YLeaf(YType.int32, 'is-publish-encaps-attr-pending'), ['int'])),
('is_publish_encaps_attr_cb_pending', (YLeaf(YType.int32, 'is-publish-encaps-attr-cb-pending'), ['int'])),
('is_intf_delete_callback_pending', (YLeaf(YType.int32, 'is-intf-delete-callback-pending'), ['int'])),
('is_intf_delete_pending', (YLeaf(YType.int32, 'is-intf-delete-pending'), ['int'])),
('is_im_owned_resource', (YLeaf(YType.int32, 'is-im-owned-resource'), ['int'])),
('is_im_final_received', (YLeaf(YType.int32, 'is-im-final-received'), ['int'])),
('is_im_owned_resource_missing', (YLeaf(YType.int32, 'is-im-owned-resource-missing'), ['int'])),
('is_aaa_start_request_callback_pending', (YLeaf(YType.int32, 'is-aaa-start-request-callback-pending'), ['int'])),
('is_aaa_owned_resource', (YLeaf(YType.int32, 'is-aaa-owned-resource'), ['int'])),
('is_aaa_disconnect_requested', (YLeaf(YType.int32, 'is-aaa-disconnect-requested'), ['int'])),
('is_aaa_disconnect_received', (YLeaf(YType.int32, 'is-aaa-disconnect-received'), ['int'])),
('is_sub_db_activate_callback_pending', (YLeaf(YType.int32, 'is-sub-db-activate-callback-pending'), ['int'])),
('is_pads_sent', (YLeaf(YType.int32, 'is-pads-sent'), ['int'])),
('is_padt_received', (YLeaf(YType.int32, 'is-padt-received'), ['int'])),
('is_in_flight', (YLeaf(YType.int32, 'is-in-flight'), ['int'])),
('is_radius_override', (YLeaf(YType.int32, 'is-radius-override'), ['int'])),
('expected_notifications', (YLeaf(YType.uint8, 'expected-notifications'), ['int'])),
('received_notifications', (YLeaf(YType.uint8, 'received-notifications'), ['int'])),
('srg_state', (YLeaf(YType.enumeration, 'srg-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionIdbSrgState', '')])),
('is_srg_data_received', (YLeaf(YType.int32, 'is-srg-data-received'), ['int'])),
('is_iedge_data_received', (YLeaf(YType.int32, 'is-iedge-data-received'), ['int'])),
])
self.interface = None
self.access_interface = None
self.session_id = None
self.sub_label = None
self.peer_mac_address = None
self.state = None
self.cdm_object_handle = None
self.chkpt_id = None
self.punted_count = None
self.port_limit = None
self.is_counted = None
self.is_vlan_outer_tag = None
self.is_vlan_inner_tag = None
self.is_cleanup_pending = None
self.is_disconnect_done_pending = None
self.is_delete_done_pending = None
self.is_intf_create_callback_pending = None
self.is_publish_encaps_attr_pending = None
self.is_publish_encaps_attr_cb_pending = None
self.is_intf_delete_callback_pending = None
self.is_intf_delete_pending = None
self.is_im_owned_resource = None
self.is_im_final_received = None
self.is_im_owned_resource_missing = None
self.is_aaa_start_request_callback_pending = None
self.is_aaa_owned_resource = None
self.is_aaa_disconnect_requested = None
self.is_aaa_disconnect_received = None
self.is_sub_db_activate_callback_pending = None
self.is_pads_sent = None
self.is_padt_received = None
self.is_in_flight = None
self.is_radius_override = None
self.expected_notifications = None
self.received_notifications = None
self.srg_state = None
self.is_srg_data_received = None
self.is_iedge_data_received = None
self.tags = Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags()
self.tags.parent = self
self._children_name_map["tags"] = "tags"
self.vlan_outer_tag = Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag()
self.vlan_outer_tag.parent = self
self._children_name_map["vlan_outer_tag"] = "vlan-outer-tag"
self.vlan_inner_tag = Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag()
self.vlan_inner_tag.parent = self
self._children_name_map["vlan_inner_tag"] = "vlan-inner-tag"
self._segment_path = lambda: "session-idb"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb, ['interface', 'access_interface', 'session_id', 'sub_label', 'peer_mac_address', 'state', 'cdm_object_handle', 'chkpt_id', 'punted_count', 'port_limit', 'is_counted', 'is_vlan_outer_tag', 'is_vlan_inner_tag', 'is_cleanup_pending', 'is_disconnect_done_pending', 'is_delete_done_pending', 'is_intf_create_callback_pending', 'is_publish_encaps_attr_pending', 'is_publish_encaps_attr_cb_pending', 'is_intf_delete_callback_pending', 'is_intf_delete_pending', 'is_im_owned_resource', 'is_im_final_received', 'is_im_owned_resource_missing', 'is_aaa_start_request_callback_pending', 'is_aaa_owned_resource', 'is_aaa_disconnect_requested', 'is_aaa_disconnect_received', 'is_sub_db_activate_callback_pending', 'is_pads_sent', 'is_padt_received', 'is_in_flight', 'is_radius_override', 'expected_notifications', 'received_notifications', 'srg_state', 'is_srg_data_received', 'is_iedge_data_received'], name, value)
class Tags(_Entity_):
"""
Tags
.. attribute:: access_loop_encapsulation
Access Loop Encapsulation
**type**\: :py:class:`AccessLoopEncapsulation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation>`
**config**\: False
.. attribute:: is_service_name
Is Service Name
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_max_payload
Is Max Payload
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_host_uniq
Is Host Uniq
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_relay_session_id
Is Relay Session ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_vendor_specific
Is Vendor Specific
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_iwf
Is IWF
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_remote_id
Is Remote ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_circuit_id
Is Circuit ID
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_tag
Is DSL Tag
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: service_name
Service Name
**type**\: str
**config**\: False
.. attribute:: max_payload
Max Payload
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: host_uniq
Host Uniq
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: relay_session_id
Relay Session ID
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: str
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: str
**config**\: False
.. attribute:: is_dsl_actual_up
Is DSL Actual Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_down
Is DSL Actual Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_up
Is DSL Min Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_down
Is DSL Min Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_attain_up
Is DSL Attain Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_attain_down
Is DSL Attain Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_up
Is DSL Max Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_down
Is DSL Max Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_up_low
Is DSL Min Up Low
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_min_down_low
Is DSL Min Down Low
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_delay_up
Is DSL Max Delay Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_delay_up
Is DSL Actual Delay Up
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_max_delay_down
Is DSL Max Delay Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_dsl_actual_delay_down
Is DSL Actual Delay Down
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: is_access_loop_encapsulation
Is Access Loop Encapsulation
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: dsl_actual_up
DSL Actual Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_down
DSL Actual Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up
DSL Min Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down
DSL Min Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_up
DSL Attain Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_down
DSL Attain Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_up
DSL Max Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_down
DSL Max Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up_low
DSL Min Up Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down_low
DSL Min Down Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_up
DSL Max Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_up
DSL Actual Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_down
DSL Max Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_down
DSL Actual Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags, self).__init__()
self.yang_name = "tags"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("access-loop-encapsulation", ("access_loop_encapsulation", Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation))])
self._leafs = OrderedDict([
('is_service_name', (YLeaf(YType.int32, 'is-service-name'), ['int'])),
('is_max_payload', (YLeaf(YType.int32, 'is-max-payload'), ['int'])),
('is_host_uniq', (YLeaf(YType.int32, 'is-host-uniq'), ['int'])),
('is_relay_session_id', (YLeaf(YType.int32, 'is-relay-session-id'), ['int'])),
('is_vendor_specific', (YLeaf(YType.int32, 'is-vendor-specific'), ['int'])),
('is_iwf', (YLeaf(YType.int32, 'is-iwf'), ['int'])),
('is_remote_id', (YLeaf(YType.int32, 'is-remote-id'), ['int'])),
('is_circuit_id', (YLeaf(YType.int32, 'is-circuit-id'), ['int'])),
('is_dsl_tag', (YLeaf(YType.int32, 'is-dsl-tag'), ['int'])),
('service_name', (YLeaf(YType.str, 'service-name'), ['str'])),
('max_payload', (YLeaf(YType.uint32, 'max-payload'), ['int'])),
('host_uniq', (YLeaf(YType.str, 'host-uniq'), ['str'])),
('relay_session_id', (YLeaf(YType.str, 'relay-session-id'), ['str'])),
('remote_id', (YLeaf(YType.str, 'remote-id'), ['str'])),
('circuit_id', (YLeaf(YType.str, 'circuit-id'), ['str'])),
('is_dsl_actual_up', (YLeaf(YType.int32, 'is-dsl-actual-up'), ['int'])),
('is_dsl_actual_down', (YLeaf(YType.int32, 'is-dsl-actual-down'), ['int'])),
('is_dsl_min_up', (YLeaf(YType.int32, 'is-dsl-min-up'), ['int'])),
('is_dsl_min_down', (YLeaf(YType.int32, 'is-dsl-min-down'), ['int'])),
('is_dsl_attain_up', (YLeaf(YType.int32, 'is-dsl-attain-up'), ['int'])),
('is_dsl_attain_down', (YLeaf(YType.int32, 'is-dsl-attain-down'), ['int'])),
('is_dsl_max_up', (YLeaf(YType.int32, 'is-dsl-max-up'), ['int'])),
('is_dsl_max_down', (YLeaf(YType.int32, 'is-dsl-max-down'), ['int'])),
('is_dsl_min_up_low', (YLeaf(YType.int32, 'is-dsl-min-up-low'), ['int'])),
('is_dsl_min_down_low', (YLeaf(YType.int32, 'is-dsl-min-down-low'), ['int'])),
('is_dsl_max_delay_up', (YLeaf(YType.int32, 'is-dsl-max-delay-up'), ['int'])),
('is_dsl_actual_delay_up', (YLeaf(YType.int32, 'is-dsl-actual-delay-up'), ['int'])),
('is_dsl_max_delay_down', (YLeaf(YType.int32, 'is-dsl-max-delay-down'), ['int'])),
('is_dsl_actual_delay_down', (YLeaf(YType.int32, 'is-dsl-actual-delay-down'), ['int'])),
('is_access_loop_encapsulation', (YLeaf(YType.int32, 'is-access-loop-encapsulation'), ['int'])),
('dsl_actual_up', (YLeaf(YType.uint32, 'dsl-actual-up'), ['int'])),
('dsl_actual_down', (YLeaf(YType.uint32, 'dsl-actual-down'), ['int'])),
('dsl_min_up', (YLeaf(YType.uint32, 'dsl-min-up'), ['int'])),
('dsl_min_down', (YLeaf(YType.uint32, 'dsl-min-down'), ['int'])),
('dsl_attain_up', (YLeaf(YType.uint32, 'dsl-attain-up'), ['int'])),
('dsl_attain_down', (YLeaf(YType.uint32, 'dsl-attain-down'), ['int'])),
('dsl_max_up', (YLeaf(YType.uint32, 'dsl-max-up'), ['int'])),
('dsl_max_down', (YLeaf(YType.uint32, 'dsl-max-down'), ['int'])),
('dsl_min_up_low', (YLeaf(YType.uint32, 'dsl-min-up-low'), ['int'])),
('dsl_min_down_low', (YLeaf(YType.uint32, 'dsl-min-down-low'), ['int'])),
('dsl_max_delay_up', (YLeaf(YType.uint32, 'dsl-max-delay-up'), ['int'])),
('dsl_actual_delay_up', (YLeaf(YType.uint32, 'dsl-actual-delay-up'), ['int'])),
('dsl_max_delay_down', (YLeaf(YType.uint32, 'dsl-max-delay-down'), ['int'])),
('dsl_actual_delay_down', (YLeaf(YType.uint32, 'dsl-actual-delay-down'), ['int'])),
])
self.is_service_name = None
self.is_max_payload = None
self.is_host_uniq = None
self.is_relay_session_id = None
self.is_vendor_specific = None
self.is_iwf = None
self.is_remote_id = None
self.is_circuit_id = None
self.is_dsl_tag = None
self.service_name = None
self.max_payload = None
self.host_uniq = None
self.relay_session_id = None
self.remote_id = None
self.circuit_id = None
self.is_dsl_actual_up = None
self.is_dsl_actual_down = None
self.is_dsl_min_up = None
self.is_dsl_min_down = None
self.is_dsl_attain_up = None
self.is_dsl_attain_down = None
self.is_dsl_max_up = None
self.is_dsl_max_down = None
self.is_dsl_min_up_low = None
self.is_dsl_min_down_low = None
self.is_dsl_max_delay_up = None
self.is_dsl_actual_delay_up = None
self.is_dsl_max_delay_down = None
self.is_dsl_actual_delay_down = None
self.is_access_loop_encapsulation = None
self.dsl_actual_up = None
self.dsl_actual_down = None
self.dsl_min_up = None
self.dsl_min_down = None
self.dsl_attain_up = None
self.dsl_attain_down = None
self.dsl_max_up = None
self.dsl_max_down = None
self.dsl_min_up_low = None
self.dsl_min_down_low = None
self.dsl_max_delay_up = None
self.dsl_actual_delay_up = None
self.dsl_max_delay_down = None
self.dsl_actual_delay_down = None
self.access_loop_encapsulation = Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation()
self.access_loop_encapsulation.parent = self
self._children_name_map["access_loop_encapsulation"] = "access-loop-encapsulation"
self._segment_path = lambda: "tags"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags, ['is_service_name', 'is_max_payload', 'is_host_uniq', 'is_relay_session_id', 'is_vendor_specific', 'is_iwf', 'is_remote_id', 'is_circuit_id', 'is_dsl_tag', 'service_name', 'max_payload', 'host_uniq', 'relay_session_id', 'remote_id', 'circuit_id', 'is_dsl_actual_up', 'is_dsl_actual_down', 'is_dsl_min_up', 'is_dsl_min_down', 'is_dsl_attain_up', 'is_dsl_attain_down', 'is_dsl_max_up', 'is_dsl_max_down', 'is_dsl_min_up_low', 'is_dsl_min_down_low', 'is_dsl_max_delay_up', 'is_dsl_actual_delay_up', 'is_dsl_max_delay_down', 'is_dsl_actual_delay_down', 'is_access_loop_encapsulation', 'dsl_actual_up', 'dsl_actual_down', 'dsl_min_up', 'dsl_min_down', 'dsl_attain_up', 'dsl_attain_down', 'dsl_max_up', 'dsl_max_down', 'dsl_min_up_low', 'dsl_min_down_low', 'dsl_max_delay_up', 'dsl_actual_delay_up', 'dsl_max_delay_down', 'dsl_actual_delay_down'], name, value)
class AccessLoopEncapsulation(_Entity_):
"""
Access Loop Encapsulation
.. attribute:: data_link
Data Link
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps1
Encaps 1
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps2
Encaps 2
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation, self).__init__()
self.yang_name = "access-loop-encapsulation"
self.yang_parent_name = "tags"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('data_link', (YLeaf(YType.uint8, 'data-link'), ['int'])),
('encaps1', (YLeaf(YType.uint8, 'encaps1'), ['int'])),
('encaps2', (YLeaf(YType.uint8, 'encaps2'), ['int'])),
])
self.data_link = None
self.encaps1 = None
self.encaps2 = None
self._segment_path = lambda: "access-loop-encapsulation"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation, ['data_link', 'encaps1', 'encaps2'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags.AccessLoopEncapsulation']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.Tags']['meta_info']
class VlanOuterTag(_Entity_):
"""
VLAN Outer Tag
.. attribute:: ether_type
Ethertype. See IEEE 802.1Q for more information
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: user_priority
User Priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: cfi
CFI
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag, self).__init__()
self.yang_name = "vlan-outer-tag"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ether_type', (YLeaf(YType.uint16, 'ether-type'), ['int'])),
('user_priority', (YLeaf(YType.uint8, 'user-priority'), ['int'])),
('cfi', (YLeaf(YType.uint8, 'cfi'), ['int'])),
('vlan_id', (YLeaf(YType.uint16, 'vlan-id'), ['int'])),
])
self.ether_type = None
self.user_priority = None
self.cfi = None
self.vlan_id = None
self._segment_path = lambda: "vlan-outer-tag"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag, ['ether_type', 'user_priority', 'cfi', 'vlan_id'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanOuterTag']['meta_info']
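The `ether_type`, `user_priority`, `cfi` and `vlan_id` leafs of `VlanOuterTag` (and the identical `VlanInnerTag` below) mirror the fields of an IEEE 802.1Q tag, whose 16-bit Tag Control Information packs PCP (3 bits), CFI/DEI (1 bit) and the VLAN ID (12 bits). A minimal standalone sketch of that packing, not part of the generated bindings:

```python
# Standalone illustration (not part of the YDK bindings): how the
# user_priority / cfi / vlan_id leafs map onto the 16-bit 802.1Q TCI.

def pack_tci(user_priority, cfi, vlan_id):
    """Combine PCP (3 bits), CFI/DEI (1 bit) and VID (12 bits) into a TCI."""
    assert 0 <= user_priority <= 7 and cfi in (0, 1) and 0 <= vlan_id <= 4095
    return (user_priority << 13) | (cfi << 12) | vlan_id

def unpack_tci(tci):
    """Split a 16-bit TCI back into (user_priority, cfi, vlan_id)."""
    return (tci >> 13) & 0x7, (tci >> 12) & 0x1, tci & 0xFFF
```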
class VlanInnerTag(_Entity_):
"""
VLAN Inner Tag
.. attribute:: ether_type
Ethertype. See IEEE 802.1Q for more information
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: user_priority
User Priority
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: cfi
CFI
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: int
**range:** 0..65535
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag, self).__init__()
self.yang_name = "vlan-inner-tag"
self.yang_parent_name = "session-idb"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ether_type', (YLeaf(YType.uint16, 'ether-type'), ['int'])),
('user_priority', (YLeaf(YType.uint8, 'user-priority'), ['int'])),
('cfi', (YLeaf(YType.uint8, 'cfi'), ['int'])),
('vlan_id', (YLeaf(YType.uint16, 'vlan-id'), ['int'])),
])
self.ether_type = None
self.user_priority = None
self.cfi = None
self.vlan_id = None
self._segment_path = lambda: "vlan-inner-tag"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag, ['ether_type', 'user_priority', 'cfi', 'vlan_id'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb.VlanInnerTag']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry.SessionIdb']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique.Entry']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.DisconnectHistoryUnique']['meta_info']
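Many `SessionIdb` and `Tags` leafs above (`is_pads_sent`, `is_padt_received`, `is_in_flight`, ...) are modelled as signed 32-bit integers but carry boolean semantics (0 = false, non-zero = true, an assumption consistent with how these oper models are commonly read). A standalone sketch, not part of the generated bindings, that collects an entity's populated `is_*` leafs as Python bools:

```python
# Standalone helper sketch (assumption: is_* int32 leafs are booleans,
# 0 = false, non-zero = true). Works on any object whose leaf values
# are stored as plain attributes; unset leafs (None) are skipped.

def boolean_flags(entity):
    """Return {leaf_name: bool} for every set attribute starting with 'is_'."""
    return {
        name: bool(value)
        for name, value in vars(entity).items()
        if name.startswith('is_') and value is not None
    }
```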
class Statistics(_Entity_):
"""
PPPoE statistics for a given node
.. attribute:: packet_counts
Packet Counts
**type**\: :py:class:`PacketCounts <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts>`
**config**\: False
.. attribute:: packet_error_counts
Packet Error Counts
**type**\: :py:class:`PacketErrorCounts <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketErrorCounts>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics, self).__init__()
self.yang_name = "statistics"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("packet-counts", ("packet_counts", Pppoe.Nodes.Node.Statistics.PacketCounts)), ("packet-error-counts", ("packet_error_counts", Pppoe.Nodes.Node.Statistics.PacketErrorCounts))])
self._leafs = OrderedDict()
self.packet_counts = Pppoe.Nodes.Node.Statistics.PacketCounts()
self.packet_counts.parent = self
self._children_name_map["packet_counts"] = "packet-counts"
self.packet_error_counts = Pppoe.Nodes.Node.Statistics.PacketErrorCounts()
self.packet_error_counts.parent = self
self._children_name_map["packet_error_counts"] = "packet-error-counts"
self._segment_path = lambda: "statistics"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics, [], name, value)
class PacketCounts(_Entity_):
"""
Packet Counts
.. attribute:: padi
PADI counts
**type**\: :py:class:`Padi <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.Padi>`
**config**\: False
.. attribute:: pado
PADO counts
**type**\: :py:class:`Pado <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.Pado>`
**config**\: False
.. attribute:: padr
PADR counts
**type**\: :py:class:`Padr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.Padr>`
**config**\: False
.. attribute:: pads_success
PADS Success counts
**type**\: :py:class:`PadsSuccess <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess>`
**config**\: False
.. attribute:: pads_error
PADS Error counts
**type**\: :py:class:`PadsError <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError>`
**config**\: False
.. attribute:: padt
PADT counts
**type**\: :py:class:`Padt <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.Padt>`
**config**\: False
.. attribute:: session_state
Session Stage counts
**type**\: :py:class:`SessionState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState>`
**config**\: False
.. attribute:: other
Other counts
**type**\: :py:class:`Other <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Statistics.PacketCounts.Other>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts, self).__init__()
self.yang_name = "packet-counts"
self.yang_parent_name = "statistics"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("padi", ("padi", Pppoe.Nodes.Node.Statistics.PacketCounts.Padi)), ("pado", ("pado", Pppoe.Nodes.Node.Statistics.PacketCounts.Pado)), ("padr", ("padr", Pppoe.Nodes.Node.Statistics.PacketCounts.Padr)), ("pads-success", ("pads_success", Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess)), ("pads-error", ("pads_error", Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError)), ("padt", ("padt", Pppoe.Nodes.Node.Statistics.PacketCounts.Padt)), ("session-state", ("session_state", Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState)), ("other", ("other", Pppoe.Nodes.Node.Statistics.PacketCounts.Other))])
self._leafs = OrderedDict()
self.padi = Pppoe.Nodes.Node.Statistics.PacketCounts.Padi()
self.padi.parent = self
self._children_name_map["padi"] = "padi"
self.pado = Pppoe.Nodes.Node.Statistics.PacketCounts.Pado()
self.pado.parent = self
self._children_name_map["pado"] = "pado"
self.padr = Pppoe.Nodes.Node.Statistics.PacketCounts.Padr()
self.padr.parent = self
self._children_name_map["padr"] = "padr"
self.pads_success = Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess()
self.pads_success.parent = self
self._children_name_map["pads_success"] = "pads-success"
self.pads_error = Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError()
self.pads_error.parent = self
self._children_name_map["pads_error"] = "pads-error"
self.padt = Pppoe.Nodes.Node.Statistics.PacketCounts.Padt()
self.padt.parent = self
self._children_name_map["padt"] = "padt"
self.session_state = Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState()
self.session_state.parent = self
self._children_name_map["session_state"] = "session-state"
self.other = Pppoe.Nodes.Node.Statistics.PacketCounts.Other()
self.other.parent = self
self._children_name_map["other"] = "other"
self._segment_path = lambda: "packet-counts"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts, [], name, value)
class Padi(_Entity_):
"""
PADI counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.Padi, self).__init__()
self.yang_name = "padi"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padi"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.Padi, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.Padi']['meta_info']
class Pado(_Entity_):
"""
PADO counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.Pado, self).__init__()
self.yang_name = "pado"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pado"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.Pado, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.Pado']['meta_info']
class Padr(_Entity_):
"""
PADR counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.Padr, self).__init__()
self.yang_name = "padr"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padr"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.Padr, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.Padr']['meta_info']
class PadsSuccess(_Entity_):
"""
PADS Success counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess, self).__init__()
self.yang_name = "pads-success"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pads-success"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.PadsSuccess']['meta_info']
class PadsError(_Entity_):
"""
PADS Error counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError, self).__init__()
self.yang_name = "pads-error"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "pads-error"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.PadsError']['meta_info']
class Padt(_Entity_):
"""
PADT counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.Padt, self).__init__()
self.yang_name = "padt"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "padt"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.Padt, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.Padt']['meta_info']
class SessionState(_Entity_):
"""
Session Stage counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState, self).__init__()
self.yang_name = "session-state"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "session-state"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.SessionState']['meta_info']
class Other(_Entity_):
"""
Other counts
.. attribute:: sent
Sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: received
Received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dropped
Dropped
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketCounts.Other, self).__init__()
self.yang_name = "other"
self.yang_parent_name = "packet-counts"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('sent', (YLeaf(YType.uint32, 'sent'), ['int'])),
('received', (YLeaf(YType.uint32, 'received'), ['int'])),
('dropped', (YLeaf(YType.uint32, 'dropped'), ['int'])),
])
self.sent = None
self.received = None
self.dropped = None
self._segment_path = lambda: "other"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketCounts.Other, ['sent', 'received', 'dropped'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts.Other']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketCounts']['meta_info']
class PacketErrorCounts(_Entity_):
"""
Packet Error Counts
.. attribute:: no_interface_handle
No interface handle
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: no_packet_payload
No packet payload
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: no_packet_mac_address
No packet mac\-address
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_version_type_value
Invalid version\-type value
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bad_packet_length
Bad packet length
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unknown_interface
Unknown interface
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: pado_received
PADO received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: pads_received
PADS received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unknown_packet_type_received
Unknown packet type received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unexpected_session_id_in_packet
Unexpected Session\-ID in packet
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: no_service_name_tag
No Service\-Name Tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: padt_for_unknown_session
PADT for unknown session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: padt_with_wrong_peer_mac
PADT with wrong peer\-mac
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: padt_with_wrong_vlan_tags
PADT with wrong VLAN tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: zero_length_host_uniq
Zero\-length Host\-Uniq tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: padt_before_pads_sent
PADT before PADS sent
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: session_stage_packet_for_unknown_session
Session\-stage packet for unknown session
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: session_stage_packet_with_wrong_mac
Session\-stage packet with wrong mac
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: session_stage_packet_with_wrong_vlan_tags
Session\-stage packet with wrong VLAN tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: session_stage_packet_with_no_error
Session\-stage packet with no error
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: tag_too_short
Tag too short
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bad_tag_length_field
Bad tag\-length field
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_service_name_tags
Multiple Service\-Name tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_max_payload_tags
Multiple Max\-Payload tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_max_payload_tag
Invalid Max\-Payload tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_vendor_specific_tags
Multiple Vendor\-specific tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unexpected_ac_name_tag
Unexpected AC\-Name tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unexpected_error_tags
Unexpected error tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unknown_tag_received
Unknown tag received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: no_iana_code_invendor_tag
No IANA code in vendor tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_iana_code_invendor_tag
Invalid IANA code in vendor tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: vendor_tag_too_short
Vendor tag too short
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: bad_vendor_tag_length_field
Bad vendor tag length field
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_host_uniq_tags
Multiple Host\-Uniq tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_relay_session_id_tags
Multiple relay\-session\-id tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_circuit_id_tags
Multiple Circuit\-ID tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_remote_id_tags
Multiple Remote\-ID tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_dsl_tag
Invalid DSL tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_of_the_same_dsl_tag
Multiple of the same DSL tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_iwf_tag
Invalid IWF tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_iwf_tags
Multiple IWF tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: unknownvendor_tag
Unknown vendor\-tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: no_space_left_in_packet
No space left in packet
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: duplicate_host_uniq_tag_received
Duplicate Host\-Uniq tag received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: duplicate_relay_session_id_tag_received
Duplicate Relay Session ID tag received
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: packet_too_long
Packet too long
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_ale_tag
Invalid ALE tag
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: multiple_ale_tags
Multiple ALE tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_service_name
Invalid Service Name
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_peer_mac
Invalid Peer MAC
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: invalid_vlan_tags
Invalid VLAN Tags
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: packet_on_srg_slave
Packet Received on SRG Slave
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Statistics.PacketErrorCounts, self).__init__()
self.yang_name = "packet-error-counts"
self.yang_parent_name = "statistics"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('no_interface_handle', (YLeaf(YType.uint32, 'no-interface-handle'), ['int'])),
('no_packet_payload', (YLeaf(YType.uint32, 'no-packet-payload'), ['int'])),
('no_packet_mac_address', (YLeaf(YType.uint32, 'no-packet-mac-address'), ['int'])),
('invalid_version_type_value', (YLeaf(YType.uint32, 'invalid-version-type-value'), ['int'])),
('bad_packet_length', (YLeaf(YType.uint32, 'bad-packet-length'), ['int'])),
('unknown_interface', (YLeaf(YType.uint32, 'unknown-interface'), ['int'])),
('pado_received', (YLeaf(YType.uint32, 'pado-received'), ['int'])),
('pads_received', (YLeaf(YType.uint32, 'pads-received'), ['int'])),
('unknown_packet_type_received', (YLeaf(YType.uint32, 'unknown-packet-type-received'), ['int'])),
('unexpected_session_id_in_packet', (YLeaf(YType.uint32, 'unexpected-session-id-in-packet'), ['int'])),
('no_service_name_tag', (YLeaf(YType.uint32, 'no-service-name-tag'), ['int'])),
('padt_for_unknown_session', (YLeaf(YType.uint32, 'padt-for-unknown-session'), ['int'])),
('padt_with_wrong_peer_mac', (YLeaf(YType.uint32, 'padt-with-wrong-peer-mac'), ['int'])),
('padt_with_wrong_vlan_tags', (YLeaf(YType.uint32, 'padt-with-wrong-vlan-tags'), ['int'])),
('zero_length_host_uniq', (YLeaf(YType.uint32, 'zero-length-host-uniq'), ['int'])),
('padt_before_pads_sent', (YLeaf(YType.uint32, 'padt-before-pads-sent'), ['int'])),
('session_stage_packet_for_unknown_session', (YLeaf(YType.uint32, 'session-stage-packet-for-unknown-session'), ['int'])),
('session_stage_packet_with_wrong_mac', (YLeaf(YType.uint32, 'session-stage-packet-with-wrong-mac'), ['int'])),
('session_stage_packet_with_wrong_vlan_tags', (YLeaf(YType.uint32, 'session-stage-packet-with-wrong-vlan-tags'), ['int'])),
('session_stage_packet_with_no_error', (YLeaf(YType.uint32, 'session-stage-packet-with-no-error'), ['int'])),
('tag_too_short', (YLeaf(YType.uint32, 'tag-too-short'), ['int'])),
('bad_tag_length_field', (YLeaf(YType.uint32, 'bad-tag-length-field'), ['int'])),
('multiple_service_name_tags', (YLeaf(YType.uint32, 'multiple-service-name-tags'), ['int'])),
('multiple_max_payload_tags', (YLeaf(YType.uint32, 'multiple-max-payload-tags'), ['int'])),
('invalid_max_payload_tag', (YLeaf(YType.uint32, 'invalid-max-payload-tag'), ['int'])),
('multiple_vendor_specific_tags', (YLeaf(YType.uint32, 'multiple-vendor-specific-tags'), ['int'])),
('unexpected_ac_name_tag', (YLeaf(YType.uint32, 'unexpected-ac-name-tag'), ['int'])),
('unexpected_error_tags', (YLeaf(YType.uint32, 'unexpected-error-tags'), ['int'])),
('unknown_tag_received', (YLeaf(YType.uint32, 'unknown-tag-received'), ['int'])),
('no_iana_code_invendor_tag', (YLeaf(YType.uint32, 'no-iana-code-invendor-tag'), ['int'])),
('invalid_iana_code_invendor_tag', (YLeaf(YType.uint32, 'invalid-iana-code-invendor-tag'), ['int'])),
('vendor_tag_too_short', (YLeaf(YType.uint32, 'vendor-tag-too-short'), ['int'])),
('bad_vendor_tag_length_field', (YLeaf(YType.uint32, 'bad-vendor-tag-length-field'), ['int'])),
('multiple_host_uniq_tags', (YLeaf(YType.uint32, 'multiple-host-uniq-tags'), ['int'])),
('multiple_relay_session_id_tags', (YLeaf(YType.uint32, 'multiple-relay-session-id-tags'), ['int'])),
('multiple_circuit_id_tags', (YLeaf(YType.uint32, 'multiple-circuit-id-tags'), ['int'])),
('multiple_remote_id_tags', (YLeaf(YType.uint32, 'multiple-remote-id-tags'), ['int'])),
('invalid_dsl_tag', (YLeaf(YType.uint32, 'invalid-dsl-tag'), ['int'])),
('multiple_of_the_same_dsl_tag', (YLeaf(YType.uint32, 'multiple-of-the-same-dsl-tag'), ['int'])),
('invalid_iwf_tag', (YLeaf(YType.uint32, 'invalid-iwf-tag'), ['int'])),
('multiple_iwf_tags', (YLeaf(YType.uint32, 'multiple-iwf-tags'), ['int'])),
('unknownvendor_tag', (YLeaf(YType.uint32, 'unknownvendor-tag'), ['int'])),
('no_space_left_in_packet', (YLeaf(YType.uint32, 'no-space-left-in-packet'), ['int'])),
('duplicate_host_uniq_tag_received', (YLeaf(YType.uint32, 'duplicate-host-uniq-tag-received'), ['int'])),
('duplicate_relay_session_id_tag_received', (YLeaf(YType.uint32, 'duplicate-relay-session-id-tag-received'), ['int'])),
('packet_too_long', (YLeaf(YType.uint32, 'packet-too-long'), ['int'])),
('invalid_ale_tag', (YLeaf(YType.uint32, 'invalid-ale-tag'), ['int'])),
('multiple_ale_tags', (YLeaf(YType.uint32, 'multiple-ale-tags'), ['int'])),
('invalid_service_name', (YLeaf(YType.uint32, 'invalid-service-name'), ['int'])),
('invalid_peer_mac', (YLeaf(YType.uint32, 'invalid-peer-mac'), ['int'])),
('invalid_vlan_tags', (YLeaf(YType.uint32, 'invalid-vlan-tags'), ['int'])),
('packet_on_srg_slave', (YLeaf(YType.uint32, 'packet-on-srg-slave'), ['int'])),
])
self.no_interface_handle = None
self.no_packet_payload = None
self.no_packet_mac_address = None
self.invalid_version_type_value = None
self.bad_packet_length = None
self.unknown_interface = None
self.pado_received = None
self.pads_received = None
self.unknown_packet_type_received = None
self.unexpected_session_id_in_packet = None
self.no_service_name_tag = None
self.padt_for_unknown_session = None
self.padt_with_wrong_peer_mac = None
self.padt_with_wrong_vlan_tags = None
self.zero_length_host_uniq = None
self.padt_before_pads_sent = None
self.session_stage_packet_for_unknown_session = None
self.session_stage_packet_with_wrong_mac = None
self.session_stage_packet_with_wrong_vlan_tags = None
self.session_stage_packet_with_no_error = None
self.tag_too_short = None
self.bad_tag_length_field = None
self.multiple_service_name_tags = None
self.multiple_max_payload_tags = None
self.invalid_max_payload_tag = None
self.multiple_vendor_specific_tags = None
self.unexpected_ac_name_tag = None
self.unexpected_error_tags = None
self.unknown_tag_received = None
self.no_iana_code_invendor_tag = None
self.invalid_iana_code_invendor_tag = None
self.vendor_tag_too_short = None
self.bad_vendor_tag_length_field = None
self.multiple_host_uniq_tags = None
self.multiple_relay_session_id_tags = None
self.multiple_circuit_id_tags = None
self.multiple_remote_id_tags = None
self.invalid_dsl_tag = None
self.multiple_of_the_same_dsl_tag = None
self.invalid_iwf_tag = None
self.multiple_iwf_tags = None
self.unknownvendor_tag = None
self.no_space_left_in_packet = None
self.duplicate_host_uniq_tag_received = None
self.duplicate_relay_session_id_tag_received = None
self.packet_too_long = None
self.invalid_ale_tag = None
self.multiple_ale_tags = None
self.invalid_service_name = None
self.invalid_peer_mac = None
self.invalid_vlan_tags = None
self.packet_on_srg_slave = None
self._segment_path = lambda: "packet-error-counts"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Statistics.PacketErrorCounts, ['no_interface_handle', 'no_packet_payload', 'no_packet_mac_address', 'invalid_version_type_value', 'bad_packet_length', 'unknown_interface', 'pado_received', 'pads_received', 'unknown_packet_type_received', 'unexpected_session_id_in_packet', 'no_service_name_tag', 'padt_for_unknown_session', 'padt_with_wrong_peer_mac', 'padt_with_wrong_vlan_tags', 'zero_length_host_uniq', 'padt_before_pads_sent', 'session_stage_packet_for_unknown_session', 'session_stage_packet_with_wrong_mac', 'session_stage_packet_with_wrong_vlan_tags', 'session_stage_packet_with_no_error', 'tag_too_short', 'bad_tag_length_field', 'multiple_service_name_tags', 'multiple_max_payload_tags', 'invalid_max_payload_tag', 'multiple_vendor_specific_tags', 'unexpected_ac_name_tag', 'unexpected_error_tags', 'unknown_tag_received', 'no_iana_code_invendor_tag', 'invalid_iana_code_invendor_tag', 'vendor_tag_too_short', 'bad_vendor_tag_length_field', 'multiple_host_uniq_tags', 'multiple_relay_session_id_tags', 'multiple_circuit_id_tags', 'multiple_remote_id_tags', 'invalid_dsl_tag', 'multiple_of_the_same_dsl_tag', 'invalid_iwf_tag', 'multiple_iwf_tags', 'unknownvendor_tag', 'no_space_left_in_packet', 'duplicate_host_uniq_tag_received', 'duplicate_relay_session_id_tag_received', 'packet_too_long', 'invalid_ale_tag', 'multiple_ale_tags', 'invalid_service_name', 'invalid_peer_mac', 'invalid_vlan_tags', 'packet_on_srg_slave'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics.PacketErrorCounts']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Statistics']['meta_info']
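# ---------------------------------------------------------------------------
# Illustrative example (not part of the generated bindings): one way to read
# the Statistics container defined above using the YDK CRUD service. The
# device address and credentials below are hypothetical placeholders, and
# "node_name" is assumed to be the key leaf of the per-node list; adjust to
# match your environment and model revision.
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#   from ydk.models.cisco_ios_xr import Cisco_IOS_XR_subscriber_pppoe_ma_oper \
#       as pppoe_oper
#
#   provider = NetconfServiceProvider(address="10.0.0.1",
#                                     username="user", password="pass")
#   crud = CRUDService()
#
#   # Read the whole operational tree, then walk per-node packet counters.
#   pppoe = crud.read(provider, pppoe_oper.Pppoe())
#   for node in pppoe.nodes.node:
#       counts = node.statistics.packet_counts
#       print(node.node_name,
#             "PADI rx:", counts.padi.received,
#             "PADR rx:", counts.padr.received,
#             "PADT tx:", counts.padt.sent)
# ---------------------------------------------------------------------------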
class AccessInterface(_Entity_):
"""
PPPoE access interface information
.. attribute:: summaries
PPPoE access interface summary information
**type**\: :py:class:`Summaries <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.AccessInterface.Summaries>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.AccessInterface, self).__init__()
self.yang_name = "access-interface"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("summaries", ("summaries", Pppoe.Nodes.Node.AccessInterface.Summaries))])
self._leafs = OrderedDict()
self.summaries = Pppoe.Nodes.Node.AccessInterface.Summaries()
self.summaries.parent = self
self._children_name_map["summaries"] = "summaries"
self._segment_path = lambda: "access-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.AccessInterface, [], name, value)
class Summaries(_Entity_):
"""
PPPoE access interface summary information
.. attribute:: summary
Summary information for a PPPoE\-enabled access interface
**type**\: list of :py:class:`Summary <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.AccessInterface.Summaries.Summary>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.AccessInterface.Summaries, self).__init__()
self.yang_name = "summaries"
self.yang_parent_name = "access-interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("summary", ("summary", Pppoe.Nodes.Node.AccessInterface.Summaries.Summary))])
self._leafs = OrderedDict()
self.summary = YList(self)
self._segment_path = lambda: "summaries"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.AccessInterface.Summaries, [], name, value)
class Summary(_Entity_):
"""
Summary information for a PPPoE\-enabled
access interface
.. attribute:: interface_name (key)
PPPoE Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface_name_xr
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: interface_state
Interface State
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: mac_address
Mac Address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: bba_group_name
BBA Group
**type**\: str
**config**\: False
.. attribute:: is_ready
Is Ready
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: sessions
Session Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: incomplete_sessions
Incomplete Session Count
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.AccessInterface.Summaries.Summary, self).__init__()
self.yang_name = "summary"
self.yang_parent_name = "summaries"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['interface_name']
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('interface_name_xr', (YLeaf(YType.str, 'interface-name-xr'), ['str'])),
('interface_state', (YLeaf(YType.uint32, 'interface-state'), ['int'])),
('mac_address', (YLeaf(YType.str, 'mac-address'), ['str'])),
('bba_group_name', (YLeaf(YType.str, 'bba-group-name'), ['str'])),
('is_ready', (YLeaf(YType.int32, 'is-ready'), ['int'])),
('sessions', (YLeaf(YType.uint32, 'sessions'), ['int'])),
('incomplete_sessions', (YLeaf(YType.uint32, 'incomplete-sessions'), ['int'])),
])
self.interface_name = None
self.interface_name_xr = None
self.interface_state = None
self.mac_address = None
self.bba_group_name = None
self.is_ready = None
self.sessions = None
self.incomplete_sessions = None
self._segment_path = lambda: "summary" + "[interface-name='" + str(self.interface_name) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.AccessInterface.Summaries.Summary, ['interface_name', 'interface_name_xr', 'interface_state', 'mac_address', 'bba_group_name', 'is_ready', 'sessions', 'incomplete_sessions'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.AccessInterface.Summaries.Summary']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.AccessInterface.Summaries']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.AccessInterface']['meta_info']
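# Usage sketch (illustrative only, not part of the generated model): reading
# the PPPoE access-interface summaries above with ydk's CRUDService over
# NETCONF. The device address and credentials are placeholders, and this
# assumes the ydk runtime and the cisco_ios_xr model bundle are installed.
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#     from ydk.models.cisco_ios_xr import Cisco_IOS_XR_subscriber_pppoe_ma_oper \
#         as pppoe_oper
#
#     provider = NetconfServiceProvider(address='10.0.0.1',
#                                       username='admin', password='admin')
#     pppoe = CRUDService().read(provider, pppoe_oper.Pppoe())
#     for node in pppoe.nodes.node:
#         for summary in node.access_interface.summaries.summary:
#             # interface_name is the list key; sessions/incomplete_sessions
#             # are the uint32 counters documented in Summary above
#             print(summary.interface_name, summary.sessions,
#                   summary.incomplete_sessions)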
class Interfaces(_Entity_):
"""
Per interface PPPoE operational data
.. attribute:: interface
Data for a PPPoE interface
**type**\: list of :py:class:`Interface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Interfaces.Interface>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Interfaces, self).__init__()
self.yang_name = "interfaces"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("interface", ("interface", Pppoe.Nodes.Node.Interfaces.Interface))])
self._leafs = OrderedDict()
self.interface = YList(self)
self._segment_path = lambda: "interfaces"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Interfaces, [], name, value)
class Interface(_Entity_):
"""
Data for a PPPoE interface
.. attribute:: interface_name (key)
PPPoE Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: tags
Tags
**type**\: :py:class:`Tags <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Interfaces.Interface.Tags>`
**config**\: False
.. attribute:: interface_name_xr
Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: access_interface_name
Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: bba_group_name
BBA Group
**type**\: str
**config**\: False
.. attribute:: session_id
Session ID
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: local_mac_address
Local Mac\-Address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: peer_mac_address
Peer Mac\-Address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: is_complete
Is Complete
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: vlan_outer_id
VLAN Outer ID
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: vlan_inner_id
VLAN Inner ID
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: srg_state
SRG state
**type**\: :py:class:`PppoeMaSessionIdbSrgState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaSessionIdbSrgState>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Interfaces.Interface, self).__init__()
self.yang_name = "interface"
self.yang_parent_name = "interfaces"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['interface_name']
self._child_classes = OrderedDict([("tags", ("tags", Pppoe.Nodes.Node.Interfaces.Interface.Tags))])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('interface_name_xr', (YLeaf(YType.str, 'interface-name-xr'), ['str'])),
('access_interface_name', (YLeaf(YType.str, 'access-interface-name'), ['str'])),
('bba_group_name', (YLeaf(YType.str, 'bba-group-name'), ['str'])),
('session_id', (YLeaf(YType.uint16, 'session-id'), ['int'])),
('local_mac_address', (YLeaf(YType.str, 'local-mac-address'), ['str'])),
('peer_mac_address', (YLeaf(YType.str, 'peer-mac-address'), ['str'])),
('is_complete', (YLeaf(YType.int32, 'is-complete'), ['int'])),
('vlan_outer_id', (YLeaf(YType.uint16, 'vlan-outer-id'), ['int'])),
('vlan_inner_id', (YLeaf(YType.uint16, 'vlan-inner-id'), ['int'])),
('srg_state', (YLeaf(YType.enumeration, 'srg-state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaSessionIdbSrgState', '')])),
])
self.interface_name = None
self.interface_name_xr = None
self.access_interface_name = None
self.bba_group_name = None
self.session_id = None
self.local_mac_address = None
self.peer_mac_address = None
self.is_complete = None
self.vlan_outer_id = None
self.vlan_inner_id = None
self.srg_state = None
self.tags = Pppoe.Nodes.Node.Interfaces.Interface.Tags()
self.tags.parent = self
self._children_name_map["tags"] = "tags"
self._segment_path = lambda: "interface" + "[interface-name='" + str(self.interface_name) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Interfaces.Interface, ['interface_name', 'interface_name_xr', 'access_interface_name', 'bba_group_name', 'session_id', 'local_mac_address', 'peer_mac_address', 'is_complete', 'vlan_outer_id', 'vlan_inner_id', 'srg_state'], name, value)
class Tags(_Entity_):
"""
Tags
.. attribute:: access_loop_encapsulation
Access Loop Encapsulation
**type**\: :py:class:`AccessLoopEncapsulation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation>`
**config**\: False
.. attribute:: service_name
Service Name
**type**\: str
**config**\: False
.. attribute:: max_payload
Max Payload
**type**\: int
**range:** 0..65535
**config**\: False
.. attribute:: host_uniq
Host Uniq
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: relay_session_id
Relay Session ID
**type**\: str
**pattern:** ([0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2})\*)?
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: str
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: str
**config**\: False
.. attribute:: is_iwf
Is IWF
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: dsl_actual_up
DSL Actual Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_down
DSL Actual Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up
DSL Min Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down
DSL Min Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_up
DSL Attain Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_attain_down
DSL Attain Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_up
DSL Max Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_down
DSL Max Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_up_low
DSL Min Up Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_min_down_low
DSL Min Down Low
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_up
DSL Max Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_up
DSL Actual Delay Up
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_max_delay_down
DSL Max Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: dsl_actual_delay_down
DSL Actual Delay Down
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Interfaces.Interface.Tags, self).__init__()
self.yang_name = "tags"
self.yang_parent_name = "interface"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("access-loop-encapsulation", ("access_loop_encapsulation", Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation))])
self._leafs = OrderedDict([
('service_name', (YLeaf(YType.str, 'service-name'), ['str'])),
('max_payload', (YLeaf(YType.uint16, 'max-payload'), ['int'])),
('host_uniq', (YLeaf(YType.str, 'host-uniq'), ['str'])),
('relay_session_id', (YLeaf(YType.str, 'relay-session-id'), ['str'])),
('remote_id', (YLeaf(YType.str, 'remote-id'), ['str'])),
('circuit_id', (YLeaf(YType.str, 'circuit-id'), ['str'])),
('is_iwf', (YLeaf(YType.int32, 'is-iwf'), ['int'])),
('dsl_actual_up', (YLeaf(YType.uint32, 'dsl-actual-up'), ['int'])),
('dsl_actual_down', (YLeaf(YType.uint32, 'dsl-actual-down'), ['int'])),
('dsl_min_up', (YLeaf(YType.uint32, 'dsl-min-up'), ['int'])),
('dsl_min_down', (YLeaf(YType.uint32, 'dsl-min-down'), ['int'])),
('dsl_attain_up', (YLeaf(YType.uint32, 'dsl-attain-up'), ['int'])),
('dsl_attain_down', (YLeaf(YType.uint32, 'dsl-attain-down'), ['int'])),
('dsl_max_up', (YLeaf(YType.uint32, 'dsl-max-up'), ['int'])),
('dsl_max_down', (YLeaf(YType.uint32, 'dsl-max-down'), ['int'])),
('dsl_min_up_low', (YLeaf(YType.uint32, 'dsl-min-up-low'), ['int'])),
('dsl_min_down_low', (YLeaf(YType.uint32, 'dsl-min-down-low'), ['int'])),
('dsl_max_delay_up', (YLeaf(YType.uint32, 'dsl-max-delay-up'), ['int'])),
('dsl_actual_delay_up', (YLeaf(YType.uint32, 'dsl-actual-delay-up'), ['int'])),
('dsl_max_delay_down', (YLeaf(YType.uint32, 'dsl-max-delay-down'), ['int'])),
('dsl_actual_delay_down', (YLeaf(YType.uint32, 'dsl-actual-delay-down'), ['int'])),
])
self.service_name = None
self.max_payload = None
self.host_uniq = None
self.relay_session_id = None
self.remote_id = None
self.circuit_id = None
self.is_iwf = None
self.dsl_actual_up = None
self.dsl_actual_down = None
self.dsl_min_up = None
self.dsl_min_down = None
self.dsl_attain_up = None
self.dsl_attain_down = None
self.dsl_max_up = None
self.dsl_max_down = None
self.dsl_min_up_low = None
self.dsl_min_down_low = None
self.dsl_max_delay_up = None
self.dsl_actual_delay_up = None
self.dsl_max_delay_down = None
self.dsl_actual_delay_down = None
self.access_loop_encapsulation = Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation()
self.access_loop_encapsulation.parent = self
self._children_name_map["access_loop_encapsulation"] = "access-loop-encapsulation"
self._segment_path = lambda: "tags"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Interfaces.Interface.Tags, ['service_name', 'max_payload', 'host_uniq', 'relay_session_id', 'remote_id', 'circuit_id', 'is_iwf', 'dsl_actual_up', 'dsl_actual_down', 'dsl_min_up', 'dsl_min_down', 'dsl_attain_up', 'dsl_attain_down', 'dsl_max_up', 'dsl_max_down', 'dsl_min_up_low', 'dsl_min_down_low', 'dsl_max_delay_up', 'dsl_actual_delay_up', 'dsl_max_delay_down', 'dsl_actual_delay_down'], name, value)
class AccessLoopEncapsulation(_Entity_):
"""
Access Loop Encapsulation
.. attribute:: data_link
Data Link
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps1
Encaps 1
**type**\: int
**range:** 0..255
**config**\: False
.. attribute:: encaps2
Encaps 2
**type**\: int
**range:** 0..255
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation, self).__init__()
self.yang_name = "access-loop-encapsulation"
self.yang_parent_name = "tags"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('data_link', (YLeaf(YType.uint8, 'data-link'), ['int'])),
('encaps1', (YLeaf(YType.uint8, 'encaps1'), ['int'])),
('encaps2', (YLeaf(YType.uint8, 'encaps2'), ['int'])),
])
self.data_link = None
self.encaps1 = None
self.encaps2 = None
self._segment_path = lambda: "access-loop-encapsulation"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation, ['data_link', 'encaps1', 'encaps2'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Interfaces.Interface.Tags.AccessLoopEncapsulation']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Interfaces.Interface.Tags']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Interfaces.Interface']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.Interfaces']['meta_info']
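# Usage sketch (illustrative only): walking the per-interface data above,
# assuming `node` is a Pppoe.Nodes.Node already read from a device as in the
# earlier sketch. The Tags child container carries the PPPoE discovery-stage
# tag values (circuit-id, remote-id, DSL line attributes) for each session.
#
#     for intf in node.interfaces.interface:
#         tags = intf.tags
#         print(intf.interface_name, intf.session_id,
#               intf.peer_mac_address, tags.circuit_id, tags.remote_id)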
class BbaGroups(_Entity_):
"""
PPPoE BBA\-Group information
.. attribute:: bba_group
PPPoE BBA\-Group information
**type**\: list of :py:class:`BbaGroup <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups, self).__init__()
self.yang_name = "bba-groups"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("bba-group", ("bba_group", Pppoe.Nodes.Node.BbaGroups.BbaGroup))])
self._leafs = OrderedDict()
self.bba_group = YList(self)
self._segment_path = lambda: "bba-groups"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups, [], name, value)
class BbaGroup(_Entity_):
"""
PPPoE BBA\-Group information
.. attribute:: bba_group_name (key)
BBA Group
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: limit_config
BBA\-Group limit configuration information
**type**\: :py:class:`LimitConfig <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig>`
**config**\: False
.. attribute:: limits
PPPoE session limit information
**type**\: :py:class:`Limits <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits>`
**config**\: False
.. attribute:: throttles
PPPoE throttle information
**type**\: :py:class:`Throttles <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles>`
**config**\: False
.. attribute:: throttle_config
BBA\-Group throttle configuration information
**type**\: :py:class:`ThrottleConfig <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup, self).__init__()
self.yang_name = "bba-group"
self.yang_parent_name = "bba-groups"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['bba_group_name']
self._child_classes = OrderedDict([("limit-config", ("limit_config", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig)), ("limits", ("limits", Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits)), ("throttles", ("throttles", Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles)), ("throttle-config", ("throttle_config", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig))])
self._leafs = OrderedDict([
('bba_group_name', (YLeaf(YType.str, 'bba-group-name'), ['str'])),
])
self.bba_group_name = None
self.limit_config = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig()
self.limit_config.parent = self
self._children_name_map["limit_config"] = "limit-config"
self.limits = Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits()
self.limits.parent = self
self._children_name_map["limits"] = "limits"
self.throttles = Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles()
self.throttles.parent = self
self._children_name_map["throttles"] = "throttles"
self.throttle_config = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig()
self.throttle_config.parent = self
self._children_name_map["throttle_config"] = "throttle-config"
self._segment_path = lambda: "bba-group" + "[bba-group-name='" + str(self.bba_group_name) + "']"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup, ['bba_group_name'], name, value)
class LimitConfig(_Entity_):
"""
BBA\-Group limit configuration information
.. attribute:: card
Card
**type**\: :py:class:`Card <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card>`
**config**\: False
.. attribute:: access_intf
Access Interface
**type**\: :py:class:`AccessIntf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf>`
**config**\: False
.. attribute:: mac
MAC
**type**\: :py:class:`Mac <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac>`
**config**\: False
.. attribute:: mac_iwf
MAC IWF
**type**\: :py:class:`MacIwf <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf>`
**config**\: False
.. attribute:: mac_access_interface
MAC Access Interface
**type**\: :py:class:`MacAccessInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface>`
**config**\: False
.. attribute:: mac_iwf_access_interface
MAC IWF Access Interface
**type**\: :py:class:`MacIwfAccessInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface>`
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: :py:class:`CircuitId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId>`
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: :py:class:`RemoteId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId>`
**config**\: False
.. attribute:: circuit_id_and_remote_id
Circuit ID and Remote ID
**type**\: :py:class:`CircuitIdAndRemoteId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId>`
**config**\: False
.. attribute:: outer_vlan_id
Outer VLAN ID
**type**\: :py:class:`OuterVlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId>`
**config**\: False
.. attribute:: inner_vlan_id
Inner VLAN ID
**type**\: :py:class:`InnerVlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId>`
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: :py:class:`VlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig, self).__init__()
self.yang_name = "limit-config"
self.yang_parent_name = "bba-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("card", ("card", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card)), ("access-intf", ("access_intf", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf)), ("mac", ("mac", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac)), ("mac-iwf", ("mac_iwf", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf)), ("mac-access-interface", ("mac_access_interface", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface)), ("mac-iwf-access-interface", ("mac_iwf_access_interface", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface)), ("circuit-id", ("circuit_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId)), ("remote-id", ("remote_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId)), ("circuit-id-and-remote-id", ("circuit_id_and_remote_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId)), ("outer-vlan-id", ("outer_vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId)), ("inner-vlan-id", ("inner_vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId)), ("vlan-id", ("vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId))])
self._leafs = OrderedDict()
self.card = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card()
self.card.parent = self
self._children_name_map["card"] = "card"
self.access_intf = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf()
self.access_intf.parent = self
self._children_name_map["access_intf"] = "access-intf"
self.mac = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac()
self.mac.parent = self
self._children_name_map["mac"] = "mac"
self.mac_iwf = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf()
self.mac_iwf.parent = self
self._children_name_map["mac_iwf"] = "mac-iwf"
self.mac_access_interface = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface()
self.mac_access_interface.parent = self
self._children_name_map["mac_access_interface"] = "mac-access-interface"
self.mac_iwf_access_interface = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface()
self.mac_iwf_access_interface.parent = self
self._children_name_map["mac_iwf_access_interface"] = "mac-iwf-access-interface"
self.circuit_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId()
self.circuit_id.parent = self
self._children_name_map["circuit_id"] = "circuit-id"
self.remote_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId()
self.remote_id.parent = self
self._children_name_map["remote_id"] = "remote-id"
self.circuit_id_and_remote_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId()
self.circuit_id_and_remote_id.parent = self
self._children_name_map["circuit_id_and_remote_id"] = "circuit-id-and-remote-id"
self.outer_vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId()
self.outer_vlan_id.parent = self
self._children_name_map["outer_vlan_id"] = "outer-vlan-id"
self.inner_vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId()
self.inner_vlan_id.parent = self
self._children_name_map["inner_vlan_id"] = "inner-vlan-id"
self.vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId()
self.vlan_id.parent = self
self._children_name_map["vlan_id"] = "vlan-id"
self._segment_path = lambda: "limit-config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig, [], name, value)
class Card(_Entity_):
"""
Card
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card, self).__init__()
self.yang_name = "card"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "card"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Card']['meta_info']
class AccessIntf(_Entity_):
"""
Access Interface
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf, self).__init__()
self.yang_name = "access-intf"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "access-intf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.AccessIntf']['meta_info']
class Mac(_Entity_):
"""
MAC
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac, self).__init__()
self.yang_name = "mac"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "mac"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.Mac']['meta_info']
class MacIwf(_Entity_):
"""
MAC IWF
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf, self).__init__()
self.yang_name = "mac-iwf"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "mac-iwf"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwf']['meta_info']
class MacAccessInterface(_Entity_):
"""
MAC Access Interface
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface, self).__init__()
self.yang_name = "mac-access-interface"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "mac-access-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacAccessInterface']['meta_info']
class MacIwfAccessInterface(_Entity_):
"""
MAC IWF Access Interface
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface, self).__init__()
self.yang_name = "mac-iwf-access-interface"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "mac-iwf-access-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.MacIwfAccessInterface']['meta_info']
class CircuitId(_Entity_):
"""
Circuit ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId, self).__init__()
self.yang_name = "circuit-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "circuit-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitId']['meta_info']
class RemoteId(_Entity_):
"""
Remote ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId, self).__init__()
self.yang_name = "remote-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "remote-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.RemoteId']['meta_info']
class CircuitIdAndRemoteId(_Entity_):
"""
Circuit ID and Remote ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId, self).__init__()
self.yang_name = "circuit-id-and-remote-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "circuit-id-and-remote-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.CircuitIdAndRemoteId']['meta_info']
class OuterVlanId(_Entity_):
"""
Outer VLAN ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId, self).__init__()
self.yang_name = "outer-vlan-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "outer-vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.OuterVlanId']['meta_info']
class InnerVlanId(_Entity_):
"""
Inner VLAN ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId, self).__init__()
self.yang_name = "inner-vlan-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "inner-vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.InnerVlanId']['meta_info']
class VlanId(_Entity_):
"""
VLAN ID
.. attribute:: max_limit
Max Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: threshold
Threshold
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_enabled
Radius override is enabled
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId, self).__init__()
self.yang_name = "vlan-id"
self.yang_parent_name = "limit-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('max_limit', (YLeaf(YType.uint32, 'max-limit'), ['int'])),
('threshold', (YLeaf(YType.uint32, 'threshold'), ['int'])),
('radius_override_enabled', (YLeaf(YType.int32, 'radius-override-enabled'), ['int'])),
])
self.max_limit = None
self.threshold = None
self.radius_override_enabled = None
self._segment_path = lambda: "vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId, ['max_limit', 'threshold', 'radius_override_enabled'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig.VlanId']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.LimitConfig']['meta_info']
class Limits(_Entity_):
"""
PPPoE session limit information
.. attribute:: limit
PPPoE session limit state
**type**\: list of :py:class:`Limit <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits.Limit>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits, self).__init__()
self.yang_name = "limits"
self.yang_parent_name = "bba-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("limit", ("limit", Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits.Limit))])
self._leafs = OrderedDict()
self.limit = YList(self)
self._segment_path = lambda: "limits"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits, [], name, value)
class Limit(_Entity_):
"""
PPPoE session limit state
.. attribute:: interface_name
Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: mac_address
MAC address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: iwf
IWF flag
**type**\: bool
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: outer_vlan_id
Outer VLAN ID
**type**\: int
**range:** 0..4095
**config**\: False
.. attribute:: inner_vlan_id
Inner VLAN ID
**type**\: int
**range:** 0..4095
**config**\: False
.. attribute:: state
State
**type**\: :py:class:`PppoeMaLimitState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaLimitState>`
**config**\: False
.. attribute:: session_count
Session Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: radius_override_set
Overridden limit has been set
**type**\: int
**range:** \-2147483648..2147483647
**config**\: False
.. attribute:: override_limit
Overridden limit if set
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits.Limit, self).__init__()
self.yang_name = "limit"
self.yang_parent_name = "limits"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('mac_address', (YLeaf(YType.str, 'mac-address'), ['str'])),
('iwf', (YLeaf(YType.boolean, 'iwf'), ['bool'])),
('circuit_id', (YLeaf(YType.str, 'circuit-id'), ['str'])),
('remote_id', (YLeaf(YType.str, 'remote-id'), ['str'])),
('outer_vlan_id', (YLeaf(YType.uint32, 'outer-vlan-id'), ['int'])),
('inner_vlan_id', (YLeaf(YType.uint32, 'inner-vlan-id'), ['int'])),
('state', (YLeaf(YType.enumeration, 'state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaLimitState', '')])),
('session_count', (YLeaf(YType.uint32, 'session-count'), ['int'])),
('radius_override_set', (YLeaf(YType.int32, 'radius-override-set'), ['int'])),
('override_limit', (YLeaf(YType.uint32, 'override-limit'), ['int'])),
])
self.interface_name = None
self.mac_address = None
self.iwf = None
self.circuit_id = None
self.remote_id = None
self.outer_vlan_id = None
self.inner_vlan_id = None
self.state = None
self.session_count = None
self.radius_override_set = None
self.override_limit = None
self._segment_path = lambda: "limit"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits.Limit, ['interface_name', 'mac_address', 'iwf', 'circuit_id', 'remote_id', 'outer_vlan_id', 'inner_vlan_id', 'state', 'session_count', 'radius_override_set', 'override_limit'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits.Limit']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.Limits']['meta_info']
class Throttles(_Entity_):
"""
PPPoE throttle information
.. attribute:: throttle
PPPoE session throttle state
**type**\: list of :py:class:`Throttle <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles.Throttle>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles, self).__init__()
self.yang_name = "throttles"
self.yang_parent_name = "bba-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("throttle", ("throttle", Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles.Throttle))])
self._leafs = OrderedDict()
self.throttle = YList(self)
self._segment_path = lambda: "throttles"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles, [], name, value)
class Throttle(_Entity_):
"""
PPPoE session throttle state
.. attribute:: interface_name
Access Interface
**type**\: str
**pattern:** [a\-zA\-Z0\-9.\_/\-]+
**config**\: False
.. attribute:: mac_address
MAC address
**type**\: str
**pattern:** [0\-9a\-fA\-F]{2}(\:[0\-9a\-fA\-F]{2}){5}
**config**\: False
.. attribute:: iwf
IWF flag
**type**\: bool
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
**config**\: False
.. attribute:: outer_vlan_id
Outer VLAN ID
**type**\: int
**range:** 0..4095
**config**\: False
.. attribute:: inner_vlan_id
Inner VLAN ID
**type**\: int
**range:** 0..4095
**config**\: False
.. attribute:: state
State
**type**\: :py:class:`PppoeMaThrottleState <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.PppoeMaThrottleState>`
**config**\: False
.. attribute:: time_left
Time left in seconds
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: since_reset
Number of seconds since counters reset
**type**\: int
**range:** 0..4294967295
**config**\: False
**units**\: second
.. attribute:: padi_count
PADI Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: padr_count
PADR Count
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles.Throttle, self).__init__()
self.yang_name = "throttle"
self.yang_parent_name = "throttles"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('interface_name', (YLeaf(YType.str, 'interface-name'), ['str'])),
('mac_address', (YLeaf(YType.str, 'mac-address'), ['str'])),
('iwf', (YLeaf(YType.boolean, 'iwf'), ['bool'])),
('circuit_id', (YLeaf(YType.str, 'circuit-id'), ['str'])),
('remote_id', (YLeaf(YType.str, 'remote-id'), ['str'])),
('outer_vlan_id', (YLeaf(YType.uint32, 'outer-vlan-id'), ['int'])),
('inner_vlan_id', (YLeaf(YType.uint32, 'inner-vlan-id'), ['int'])),
('state', (YLeaf(YType.enumeration, 'state'), [('ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper', 'PppoeMaThrottleState', '')])),
('time_left', (YLeaf(YType.uint32, 'time-left'), ['int'])),
('since_reset', (YLeaf(YType.uint32, 'since-reset'), ['int'])),
('padi_count', (YLeaf(YType.uint32, 'padi-count'), ['int'])),
('padr_count', (YLeaf(YType.uint32, 'padr-count'), ['int'])),
])
self.interface_name = None
self.mac_address = None
self.iwf = None
self.circuit_id = None
self.remote_id = None
self.outer_vlan_id = None
self.inner_vlan_id = None
self.state = None
self.time_left = None
self.since_reset = None
self.padi_count = None
self.padr_count = None
self._segment_path = lambda: "throttle"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles.Throttle, ['interface_name', 'mac_address', 'iwf', 'circuit_id', 'remote_id', 'outer_vlan_id', 'inner_vlan_id', 'state', 'time_left', 'since_reset', 'padi_count', 'padr_count'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles.Throttle']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.Throttles']['meta_info']
class ThrottleConfig(_Entity_):
"""
BBA\-Group throttle configuration information
.. attribute:: mac
MAC
**type**\: :py:class:`Mac <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac>`
**config**\: False
.. attribute:: mac_access_interface
MAC Access Interface
**type**\: :py:class:`MacAccessInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface>`
**config**\: False
.. attribute:: mac_iwf_access_interface
MAC IWF Access Interface
**type**\: :py:class:`MacIwfAccessInterface <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface>`
**config**\: False
.. attribute:: circuit_id
Circuit ID
**type**\: :py:class:`CircuitId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId>`
**config**\: False
.. attribute:: remote_id
Remote ID
**type**\: :py:class:`RemoteId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId>`
**config**\: False
.. attribute:: circuit_id_and_remote_id
Circuit ID and Remote ID
**type**\: :py:class:`CircuitIdAndRemoteId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId>`
**config**\: False
.. attribute:: outer_vlan_id
Outer VLAN ID
**type**\: :py:class:`OuterVlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId>`
**config**\: False
.. attribute:: inner_vlan_id
Inner VLAN ID
**type**\: :py:class:`InnerVlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId>`
**config**\: False
.. attribute:: vlan_id
VLAN ID
**type**\: :py:class:`VlanId <ydk.models.cisco_ios_xr.Cisco_IOS_XR_subscriber_pppoe_ma_oper.Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId>`
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig, self).__init__()
self.yang_name = "throttle-config"
self.yang_parent_name = "bba-group"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([("mac", ("mac", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac)), ("mac-access-interface", ("mac_access_interface", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface)), ("mac-iwf-access-interface", ("mac_iwf_access_interface", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface)), ("circuit-id", ("circuit_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId)), ("remote-id", ("remote_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId)), ("circuit-id-and-remote-id", ("circuit_id_and_remote_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId)), ("outer-vlan-id", ("outer_vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId)), ("inner-vlan-id", ("inner_vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId)), ("vlan-id", ("vlan_id", Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId))])
self._leafs = OrderedDict()
self.mac = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac()
self.mac.parent = self
self._children_name_map["mac"] = "mac"
self.mac_access_interface = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface()
self.mac_access_interface.parent = self
self._children_name_map["mac_access_interface"] = "mac-access-interface"
self.mac_iwf_access_interface = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface()
self.mac_iwf_access_interface.parent = self
self._children_name_map["mac_iwf_access_interface"] = "mac-iwf-access-interface"
self.circuit_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId()
self.circuit_id.parent = self
self._children_name_map["circuit_id"] = "circuit-id"
self.remote_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId()
self.remote_id.parent = self
self._children_name_map["remote_id"] = "remote-id"
self.circuit_id_and_remote_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId()
self.circuit_id_and_remote_id.parent = self
self._children_name_map["circuit_id_and_remote_id"] = "circuit-id-and-remote-id"
self.outer_vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId()
self.outer_vlan_id.parent = self
self._children_name_map["outer_vlan_id"] = "outer-vlan-id"
self.inner_vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId()
self.inner_vlan_id.parent = self
self._children_name_map["inner_vlan_id"] = "inner-vlan-id"
self.vlan_id = Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId()
self.vlan_id.parent = self
self._children_name_map["vlan_id"] = "vlan-id"
self._segment_path = lambda: "throttle-config"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig, [], name, value)
class Mac(_Entity_):
"""
MAC
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac, self).__init__()
self.yang_name = "mac"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "mac"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.Mac']['meta_info']
class MacAccessInterface(_Entity_):
"""
MAC Access Interface
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface, self).__init__()
self.yang_name = "mac-access-interface"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "mac-access-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacAccessInterface']['meta_info']
class MacIwfAccessInterface(_Entity_):
"""
MAC IWF Access Interface
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface, self).__init__()
self.yang_name = "mac-iwf-access-interface"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "mac-iwf-access-interface"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.MacIwfAccessInterface']['meta_info']
class CircuitId(_Entity_):
"""
Circuit ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId, self).__init__()
self.yang_name = "circuit-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "circuit-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitId']['meta_info']
class RemoteId(_Entity_):
"""
Remote ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId, self).__init__()
self.yang_name = "remote-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "remote-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.RemoteId']['meta_info']
class CircuitIdAndRemoteId(_Entity_):
"""
Circuit ID and Remote ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId, self).__init__()
self.yang_name = "circuit-id-and-remote-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "circuit-id-and-remote-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.CircuitIdAndRemoteId']['meta_info']
class OuterVlanId(_Entity_):
"""
Outer VLAN ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId, self).__init__()
self.yang_name = "outer-vlan-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "outer-vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.OuterVlanId']['meta_info']
class InnerVlanId(_Entity_):
"""
Inner VLAN ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId, self).__init__()
self.yang_name = "inner-vlan-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "inner-vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.InnerVlanId']['meta_info']
class VlanId(_Entity_):
"""
VLAN ID
.. attribute:: limit
Limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: request_period
Request Period
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: blocking_period
Blocking Period
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId, self).__init__()
self.yang_name = "vlan-id"
self.yang_parent_name = "throttle-config"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('limit', (YLeaf(YType.uint32, 'limit'), ['int'])),
('request_period', (YLeaf(YType.uint32, 'request-period'), ['int'])),
('blocking_period', (YLeaf(YType.uint32, 'blocking-period'), ['int'])),
])
self.limit = None
self.request_period = None
self.blocking_period = None
self._segment_path = lambda: "vlan-id"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId, ['limit', 'request_period', 'blocking_period'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig.VlanId']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup.ThrottleConfig']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups.BbaGroup']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.BbaGroups']['meta_info']
class SummaryTotal(_Entity_):
"""
PPPoE statistics for a given node
.. attribute:: ready_access_interfaces
Ready Access Interface Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: not_ready_access_interfaces
Not Ready Access Interface Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: complete_sessions
Complete Session Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: incomplete_sessions
Incomplete Session Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: flow_control_limit
Flow Control credit limit
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: flow_control_in_flight_sessions
Flow Control In\-Flight Count
**type**\: int
**range:** 0..4294967295
**config**\: False
.. attribute:: flow_control_dropped_sessions
Flow Control Drop Count
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: flow_control_disconnected_sessions
Flow Control Disconnected Count
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: flow_control_successful_sessions
Flow Control Success Count, sessions completing call flow
**type**\: int
**range:** 0..18446744073709551615
**config**\: False
.. attribute:: pppoema_subscriber_infra_flow_control
PPPoEMASubscriberInfraFlowControl
**type**\: int
**range:** 0..4294967295
**config**\: False
"""
_prefix = 'subscriber-pppoe-ma-oper'
_revision = '2019-10-07'
def __init__(self):
if sys.version_info > (3,):
super().__init__()
else:
super(Pppoe.Nodes.Node.SummaryTotal, self).__init__()
self.yang_name = "summary-total"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_classes = OrderedDict([])
self._leafs = OrderedDict([
('ready_access_interfaces', (YLeaf(YType.uint32, 'ready-access-interfaces'), ['int'])),
('not_ready_access_interfaces', (YLeaf(YType.uint32, 'not-ready-access-interfaces'), ['int'])),
('complete_sessions', (YLeaf(YType.uint32, 'complete-sessions'), ['int'])),
('incomplete_sessions', (YLeaf(YType.uint32, 'incomplete-sessions'), ['int'])),
('flow_control_limit', (YLeaf(YType.uint32, 'flow-control-limit'), ['int'])),
('flow_control_in_flight_sessions', (YLeaf(YType.uint32, 'flow-control-in-flight-sessions'), ['int'])),
('flow_control_dropped_sessions', (YLeaf(YType.uint64, 'flow-control-dropped-sessions'), ['int'])),
('flow_control_disconnected_sessions', (YLeaf(YType.uint64, 'flow-control-disconnected-sessions'), ['int'])),
('flow_control_successful_sessions', (YLeaf(YType.uint64, 'flow-control-successful-sessions'), ['int'])),
('pppoema_subscriber_infra_flow_control', (YLeaf(YType.uint32, 'pppoema-subscriber-infra-flow-control'), ['int'])),
])
self.ready_access_interfaces = None
self.not_ready_access_interfaces = None
self.complete_sessions = None
self.incomplete_sessions = None
self.flow_control_limit = None
self.flow_control_in_flight_sessions = None
self.flow_control_dropped_sessions = None
self.flow_control_disconnected_sessions = None
self.flow_control_successful_sessions = None
self.pppoema_subscriber_infra_flow_control = None
self._segment_path = lambda: "summary-total"
self._is_frozen = True
def __setattr__(self, name, value):
self._perform_setattr(Pppoe.Nodes.Node.SummaryTotal, ['ready_access_interfaces', 'not_ready_access_interfaces', 'complete_sessions', 'incomplete_sessions', 'flow_control_limit', 'flow_control_in_flight_sessions', 'flow_control_dropped_sessions', 'flow_control_disconnected_sessions', 'flow_control_successful_sessions', 'pppoema_subscriber_infra_flow_control'], name, value)
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node.SummaryTotal']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes.Node']['meta_info']
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe.Nodes']['meta_info']
def clone_ptr(self):
self._top_entity = Pppoe()
return self._top_entity
@staticmethod
def _meta_info():
from ydk.models.cisco_ios_xr._meta import _Cisco_IOS_XR_subscriber_pppoe_ma_oper as meta
return meta._meta_table['Pppoe']['meta_info']
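# Usage sketch (illustrative only, not part of the generated bindings):
# reading PPPoE operational data with ydk-py's CRUDService. Assumes a
# reachable NETCONF-enabled IOS XR device; the address and credentials
# below are placeholders.
#
#   from ydk.services import CRUDService
#   from ydk.providers import NetconfServiceProvider
#
#   provider = NetconfServiceProvider(address="10.0.0.1",
#                                     username="admin", password="admin")
#   crud = CRUDService()
#   pppoe = crud.read(provider, Pppoe())
#   for node in pppoe.nodes.node:
#       print(node.summary_total.complete_sessions)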
# --- http_test.py (is-already-taken/py-jira-cli, MIT) ---
import unittest
import fudge
from fudge.inspector import arg
import pycurl
import http
import mox
class HttpTest(unittest.TestCase):
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_post(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
# perform() invokes the callback registered via
# setopt(pycurl.HEADERFUNCTION, ...). Intercept that setopt call and feed
# the callback a Set-Cookie header, as the server would.
def get_header_fn(option, value):
if option == pycurl.HEADERFUNCTION:
value("Set-Cookie: new-cookie=1")
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
mock.expects('setopt').with_args(pycurl.POST, 1)
mock.expects('setopt').with_args(pycurl.POSTFIELDSIZE, len("REQUEST DATA"))
mock.expects('setopt').with_args(pycurl.POSTFIELDS, "REQUEST DATA")
mock.expects('setopt').with_args(pycurl.HTTPHEADER, ["Hdr-A: 1", "Hdr-B: 2"])
mock.expects('setopt').with_args(pycurl.COOKIE, "")
mock.provides('setopt').calls(get_header_fn)
mock.expects('setopt').with_args(pycurl.WRITEFUNCTION, arg.any())
mock.expects('perform')
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
mock.provides('getinfo').calls(lambda _: 200)
h = http.Http()
(status, buf, cookies) = h.method("POST", "http://host/path", "REQUEST DATA", headers=["Hdr-A: 1", "Hdr-B: 2"])
assert status == 200
assert buf == "RESPONSE DATA"
assert cookies == ["new-cookie=1"]
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_post_dict_data(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
# perform() invokes the callback registered via
# setopt(pycurl.HEADERFUNCTION, ...). Intercept that setopt call and feed
# the callback a Set-Cookie header, as the server would.
def get_header_fn(option, value):
if option == pycurl.HEADERFUNCTION:
value("Set-Cookie: new-cookie=1")
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
mock.expects('setopt').with_args(pycurl.POST, 1)
mock.expects('setopt').with_args(pycurl.POSTFIELDSIZE, len("data1=A&data2=B"))
mock.expects('setopt').with_args(pycurl.POSTFIELDS, "data1=A&data2=B")
mock.expects('setopt').with_args(pycurl.HTTPHEADER, [
"Hdr-A: 1",
"Hdr-B: 2",
"Content-Type: application/x-www-form-urlencoded; charset=UTF-8"
])
mock.expects('setopt').with_args(pycurl.COOKIE, "")
mock.provides('setopt').calls(get_header_fn)
mock.expects('setopt').with_args(pycurl.WRITEFUNCTION, arg.any())
mock.expects('perform')
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
mock.provides('getinfo').calls(lambda _: 200)
h = http.Http()
(status, buf, cookies) = h.method("POST", "http://host/path", {"data1": "A", "data2": "B"}, headers=["Hdr-A: 1", "Hdr-B: 2"])
assert status == 200
assert buf == "RESPONSE DATA"
assert cookies == ["new-cookie=1"]
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_post_no_data(self, Curl_Mock, StringIO_Mock):
mock = (Curl_Mock.expects_call()
.returns_fake())
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
h = http.Http()
self.assertRaises(Exception, h.method, "POST", None)
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_put(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
# perform() invokes the callback registered via
# setopt(pycurl.HEADERFUNCTION, ...). Intercept that setopt call and feed
# the callback a Set-Cookie header, as the server would.
def get_header_fn(option, value):
if option == pycurl.HEADERFUNCTION:
value("Set-Cookie: new-cookie=1")
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
mock.expects('setopt').with_args(pycurl.CUSTOMREQUEST, "PUT")
mock.expects('setopt').with_args(pycurl.POSTFIELDSIZE, len("REQUEST DATA"))
mock.expects('setopt').with_args(pycurl.POSTFIELDS, "REQUEST DATA")
mock.expects('setopt').with_args(pycurl.HTTPHEADER, ["Hdr-A: 1", "Hdr-B: 2"])
mock.expects('setopt').with_args(pycurl.COOKIE, "")
mock.provides('setopt').calls(get_header_fn)
mock.expects('setopt').with_args(pycurl.WRITEFUNCTION, arg.any())
mock.expects('perform')
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
mock.provides('getinfo').calls(lambda _: 200)
h = http.Http()
(status, buf, cookies) = h.method("PUT", "http://host/path", "REQUEST DATA", headers=["Hdr-A: 1", "Hdr-B: 2"])
assert status == 200
assert buf == "RESPONSE DATA"
assert cookies == ["new-cookie=1"]
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_put_dict_data(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
# perform() invokes the callback registered via
# setopt(pycurl.HEADERFUNCTION, ...). Intercept that setopt call and feed
# the callback a Set-Cookie header, as the server would.
def get_header_fn(option, value):
if option == pycurl.HEADERFUNCTION:
value("Set-Cookie: new-cookie=1")
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
mock.expects('setopt').with_args(pycurl.CUSTOMREQUEST, "PUT")
mock.expects('setopt').with_args(pycurl.POSTFIELDSIZE, len("data1=A&data2=B"))
mock.expects('setopt').with_args(pycurl.POSTFIELDS, "data1=A&data2=B")
mock.expects('setopt').with_args(pycurl.HTTPHEADER, [
"Hdr-A: 1",
"Hdr-B: 2",
"Content-Type: application/x-www-form-urlencoded; charset=UTF-8"
])
mock.expects('setopt').with_args(pycurl.COOKIE, "")
mock.provides('setopt').calls(get_header_fn)
mock.expects('setopt').with_args(pycurl.WRITEFUNCTION, arg.any())
mock.expects('perform')
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
mock.provides('getinfo').calls(lambda _: 200)
h = http.Http()
(status, buf, cookies) = h.method("PUT", "http://host/path", {"data1": "A", "data2": "B"}, headers=["Hdr-A: 1", "Hdr-B: 2"])
assert status == 200
assert buf == "RESPONSE DATA"
assert cookies == ["new-cookie=1"]
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_put_no_data(self, Curl_Mock, StringIO_Mock):
mock = (Curl_Mock.expects_call()
.returns_fake())
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
h = http.Http()
self.assertRaises(Exception, h.method, "PUT", None)
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_get(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
# perform() invokes the callback registered via
# setopt(pycurl.HEADERFUNCTION, ...). Intercept that setopt call and feed
# the callback a Set-Cookie header, as the server would.
def get_header_fn(option, value):
if option == pycurl.HEADERFUNCTION:
value("Set-Cookie: new-cookie=1")
mock.expects('setopt').with_args(pycurl.VERBOSE, 0)
mock.expects('setopt').with_args(pycurl.URL, "http://host/path")
mock.expects('setopt').with_args(pycurl.HTTPGET, 1)
mock.expects('setopt').with_args(pycurl.HTTPHEADER, [
"Hdr-A: 1",
"Hdr-B: 2"
])
mock.expects('setopt').with_args(pycurl.COOKIE, "")
mock.provides('setopt').calls(get_header_fn)
mock.expects('setopt').with_args(pycurl.WRITEFUNCTION, arg.any())
mock.expects('perform')
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
mock.provides('getinfo').calls(lambda _: 200)
h = http.Http()
(status, buf, cookies) = h.method("GET", "http://host/path", headers=["Hdr-A: 1", "Hdr-B: 2"])
assert status == 200
assert buf == "RESPONSE DATA"
assert cookies == ["new-cookie=1"]
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_raise_http_exception_on_perform_fail(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
def raise_perform_error():
raise pycurl.error(42, "PyCURL Exception")
mock.provides('setopt')
mock.provides('perform').calls(raise_perform_error)
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
h = http.Http()
self.assertRaises(http.HttpError, h.method, "GET", None)
@fudge.patch("pycurl.Curl")
@fudge.patch("StringIO.StringIO")
def test_method_raise_original_exception_on_perform_fail(self, Curl_Mock, StringIO_Mock):
(StringIO_Mock.expects_call()
.returns_fake()
.provides("write")
.provides("getvalue").returns("RESPONSE DATA"))
mock = (Curl_Mock.expects_call()
.returns_fake())
class CustomException(Exception):
def __init__(self, msg):
pass
def raise_perform_error():
raise CustomException("Custom exception")
mock.provides('setopt')
mock.provides('perform').calls(raise_perform_error)
mock.has_attr(RESPONSE_CODE="RESPONSE_CODE_MOCK")
h = http.Http()
self.assertRaises(CustomException, h.method, "GET", None)
def setUp(self):
self.mox = mox.Mox()
def tearDown(self):
self.mox.UnsetStubs()
def test_get(self):
h = http.Http()
self.mox.StubOutWithMock(h, "method")
h.method("GET", "http://host/path", None, ["Hdr-A: 1"], ["cookie=abc"]).AndReturn( (200, "Response", ["set-cookie-a=1", "set-cookie-b=2"]) )
self.mox.ReplayAll()
h.get("http://host/path", ["Hdr-A: 1"], ["cookie=abc"])
self.mox.VerifyAll()
def test_post(self):
h = http.Http()
self.mox.StubOutWithMock(h, "method")
h.method("POST", "http://host/path", "DATA", ["Hdr-A: 1"], ["cookie=abc"]).AndReturn( (200, "Response", ["set-cookie-a=1", "set-cookie-b=2"]) )
self.mox.ReplayAll()
h.post("http://host/path", "DATA", ["Hdr-A: 1"], ["cookie=abc"])
self.mox.VerifyAll()
def test_put(self):
h = http.Http()
self.mox.StubOutWithMock(h, "method")
h.method("PUT", "http://host/path", "DATA", ["Hdr-A: 1"], ["cookie=abc"]).AndReturn( (200, "Response", ["set-cookie-a=1", "set-cookie-b=2"]) )
self.mox.ReplayAll()
h.put("http://host/path", "DATA", ["Hdr-A: 1"], ["cookie=abc"])
self.mox.VerifyAll()
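# Usage sketch of the Http class exercised above (illustrative): each verb
# helper delegates to Http.method() and returns a (status, body, cookies)
# tuple. Headers and cookies are passed positionally, mirroring the tests.
#
#   h = http.Http()
#   (status, body, cookies) = h.get("http://host/path",
#                                   ["Accept: application/json"],
#                                   ["session=abc"])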
# --- mitorch/models/darknet.py (shonohs/shtorch_models, MIT) ---
YOLO9000: Better, Faster, Stronger (https://arxiv.org/pdf/1612.08242)
"""
import collections
import torch
from .model import Model
from .modules import Conv2dAct
class Darknet19(Model):
def __init__(self):
super().__init__(1024)
self.features = torch.nn.Sequential(collections.OrderedDict([
('conv0', Conv2dAct(3, 32, kernel_size=3, padding=1, activation='leaky_relu')),
('pool0', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv1', Conv2dAct(32, 64, kernel_size=3, padding=1, activation='leaky_relu')),
('pool1', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv2', Conv2dAct(64, 128, kernel_size=3, padding=1, activation='leaky_relu')),
('conv3', Conv2dAct(128, 64, kernel_size=1, activation='leaky_relu')),
('conv4', Conv2dAct(64, 128, kernel_size=3, padding=1, activation='leaky_relu')),
('pool2', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv5', Conv2dAct(128, 256, kernel_size=3, padding=1, activation='leaky_relu')),
('conv6', Conv2dAct(256, 128, kernel_size=1, activation='leaky_relu')),
('conv7', Conv2dAct(128, 256, kernel_size=3, padding=1, activation='leaky_relu')),
('pool3', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv8', Conv2dAct(256, 512, kernel_size=3, padding=1, activation='leaky_relu')),
('conv9', Conv2dAct(512, 256, kernel_size=1, activation='leaky_relu')),
('conv10', Conv2dAct(256, 512, kernel_size=3, padding=1, activation='leaky_relu')),
('conv11', Conv2dAct(512, 256, kernel_size=1, activation='leaky_relu')),
('conv12', Conv2dAct(256, 512, kernel_size=3, padding=1, activation='leaky_relu')),
('pool4', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv13', Conv2dAct(512, 1024, kernel_size=3, padding=1, activation='leaky_relu')),
('conv14', Conv2dAct(1024, 512, kernel_size=1, activation='leaky_relu')),
('conv15', Conv2dAct(512, 1024, kernel_size=3, padding=1, activation='leaky_relu')),
('conv16', Conv2dAct(1024, 512, kernel_size=1, activation='leaky_relu')),
('conv17', Conv2dAct(512, 1024, kernel_size=3, padding=1, activation='leaky_relu')),
('pool5', torch.nn.AdaptiveAvgPool2d(1)),
('flatten', torch.nn.Flatten())]))
class TinyDarknet(Model):
"""Base model for TinyYoloV2"""
def __init__(self):
super().__init__(1024)
self.features = torch.nn.Sequential(collections.OrderedDict([
('conv0', Conv2dAct(3, 16, kernel_size=3, padding=1, activation='leaky_relu')),
('pool0', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv1', Conv2dAct(16, 32, kernel_size=3, padding=1, activation='leaky_relu')),
('pool1', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv2', Conv2dAct(32, 64, kernel_size=3, padding=1, activation='leaky_relu')),
('pool2', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv3', Conv2dAct(64, 128, kernel_size=3, padding=1, activation='leaky_relu')),
('pool3', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv4', Conv2dAct(128, 256, kernel_size=3, padding=1, activation='leaky_relu')),
('pool4', torch.nn.MaxPool2d(kernel_size=2, stride=2)),
('conv5', Conv2dAct(256, 512, kernel_size=3, padding=1, activation='leaky_relu')),
('pool5', torch.nn.MaxPool2d(kernel_size=2, stride=1)),
('conv6', Conv2dAct(512, 1024, kernel_size=3, padding=1, activation='leaky_relu')),
('pool6', torch.nn.AdaptiveAvgPool2d(1)),
('flatten', torch.nn.Flatten())]))
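Every conv in TinyDarknet uses kernel_size=3 with padding=1 (stride 1), so only the pooling layers change the spatial size; the resulting map sizes can be checked with plain pooling arithmetic. The 416x416 input below is an assumption (typical for the YOLOv2 family), not something stated in this file:

```python
def pool_out(n, k, s):
    """Output side length of MaxPool2d(kernel_size=k, stride=s), no padding."""
    return (n - k) // s + 1

n = 416                    # assumed input resolution
for _ in range(5):         # pool0..pool4: kernel 2, stride 2, each halves the map
    n = pool_out(n, 2, 2)
assert n == 13
n = pool_out(n, 2, 1)      # pool5: kernel 2, stride 1
assert n == 12             # pool6 (AdaptiveAvgPool2d(1)) then collapses this to 1x1
```

The final AdaptiveAvgPool2d(1) plus Flatten is what lets `super().__init__(1024)` fix the feature width at 1024 regardless of the input resolution.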
# coding: utf-8
# File: integration/python/integration_api/api/kms_api.py (repo: sumit4-ttn/SDK, license: Apache-2.0)
"""
Hydrogen Integration API
The Hydrogen Integration API # noqa: E501
OpenAPI spec version: 1.2.1
Contact: info@hydrogenplatform.com
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from integration_api.api_client import ApiClient
class KMSApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def create_kms_using_post(self, kms_config, **kwargs): # noqa: E501
"""Create an secret key # noqa: E501
Create an secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_kms_using_post(kms_config, async_req=True)
>>> result = thread.get()
:param async_req bool
:param KmsConfig kms_config: kmsConfig (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.create_kms_using_post_with_http_info(kms_config, **kwargs) # noqa: E501
else:
(data) = self.create_kms_using_post_with_http_info(kms_config, **kwargs) # noqa: E501
return data
def create_kms_using_post_with_http_info(self, kms_config, **kwargs): # noqa: E501
"""Create an secret key # noqa: E501
Create an secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.create_kms_using_post_with_http_info(kms_config, async_req=True)
>>> result = thread.get()
:param async_req bool
:param KmsConfig kms_config: kmsConfig (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['kms_config'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method create_kms_using_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'kms_config' is set
if ('kms_config' not in params or
params['kms_config'] is None):
raise ValueError("Missing the required parameter `kms_config` when calling `create_kms_using_post`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'kms_config' in params:
body_params = params['kms_config']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/kms', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='KmsConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def delete_kms_using_delete(self, kms_id, **kwargs): # noqa: E501
"""Delete an secret key value # noqa: E501
Permanently delete an secret key value under a tenant. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_kms_using_delete(kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str kms_id: KMS Id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.delete_kms_using_delete_with_http_info(kms_id, **kwargs) # noqa: E501
else:
(data) = self.delete_kms_using_delete_with_http_info(kms_id, **kwargs) # noqa: E501
return data
def delete_kms_using_delete_with_http_info(self, kms_id, **kwargs): # noqa: E501
"""Delete an secret key value # noqa: E501
Permanently delete an secret key value under a tenant. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.delete_kms_using_delete_with_http_info(kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str kms_id: KMS Id (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['kms_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method delete_kms_using_delete" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'kms_id' is set
if ('kms_id' not in params or
params['kms_id'] is None):
raise ValueError("Missing the required parameter `kms_id` when calling `delete_kms_using_delete`") # noqa: E501
collection_formats = {}
path_params = {}
if 'kms_id' in params:
path_params['kms_id'] = params['kms_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['*/*']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/kms/{kms_id}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_kms_all_using_get(self, **kwargs): # noqa: E501
"""List all KMS Clients # noqa: E501
Get details for all clients registered with your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kms_all_using_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageKmsConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_kms_all_using_get_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.get_kms_all_using_get_with_http_info(**kwargs) # noqa: E501
return data
def get_kms_all_using_get_with_http_info(self, **kwargs): # noqa: E501
"""List all KMS Clients # noqa: E501
Get details for all clients registered with your firm. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kms_all_using_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param bool ascending: ascending
:param str order_by: order_by
:param int page: page
:param int size: size
:return: PageKmsConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['ascending', 'order_by', 'page', 'size'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_kms_all_using_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'ascending' in params:
query_params.append(('ascending', params['ascending'])) # noqa: E501
if 'order_by' in params:
query_params.append(('order_by', params['order_by'])) # noqa: E501
if 'page' in params:
query_params.append(('page', params['page'])) # noqa: E501
if 'size' in params:
query_params.append(('size', params['size'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/kms', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='PageKmsConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def get_kms_using_get(self, kms_id, **kwargs): # noqa: E501
"""Retrieve an secret key value # noqa: E501
Retrieve the information for a specific value associated with a Secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kms_using_get(kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str kms_id: KMS Id (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.get_kms_using_get_with_http_info(kms_id, **kwargs) # noqa: E501
else:
(data) = self.get_kms_using_get_with_http_info(kms_id, **kwargs) # noqa: E501
return data
def get_kms_using_get_with_http_info(self, kms_id, **kwargs): # noqa: E501
"""Retrieve an secret key value # noqa: E501
Retrieve the information for a specific value associated with a Secret key. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.get_kms_using_get_with_http_info(kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str kms_id: KMS Id (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['kms_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method get_kms_using_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'kms_id' is set
if ('kms_id' not in params or
params['kms_id'] is None):
raise ValueError("Missing the required parameter `kms_id` when calling `get_kms_using_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'kms_id' in params:
path_params['kms_id'] = params['kms_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/kms/{kms_id}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='KmsConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def update_kms_using_put(self, kms_config, kms_id, **kwargs): # noqa: E501
"""Update an Key Value # noqa: E501
Update the information for an key value. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_kms_using_put(kms_config, kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param KmsConfig kms_config: kmsConfig (required)
:param str kms_id: kms_id (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.update_kms_using_put_with_http_info(kms_config, kms_id, **kwargs) # noqa: E501
else:
(data) = self.update_kms_using_put_with_http_info(kms_config, kms_id, **kwargs) # noqa: E501
return data
def update_kms_using_put_with_http_info(self, kms_config, kms_id, **kwargs): # noqa: E501
"""Update an Key Value # noqa: E501
Update the information for an key value. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.update_kms_using_put_with_http_info(kms_config, kms_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param KmsConfig kms_config: kmsConfig (required)
:param str kms_id: kms_id (required)
:return: KmsConfig
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['kms_config', 'kms_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method update_kms_using_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'kms_config' is set
if ('kms_config' not in params or
params['kms_config'] is None):
raise ValueError("Missing the required parameter `kms_config` when calling `update_kms_using_put`") # noqa: E501
# verify the required parameter 'kms_id' is set
if ('kms_id' not in params or
params['kms_id'] is None):
raise ValueError("Missing the required parameter `kms_id` when calling `update_kms_using_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'kms_id' in params:
path_params['kms_id'] = params['kms_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'kms_config' in params:
body_params = params['kms_config']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['oauth2'] # noqa: E501
return self.api_client.call_api(
'/kms/{kms_id}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='KmsConfig', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
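Each generated method above rejects unknown keyword arguments by checking `**kwargs` against a whitelist built from `all_params`. The pattern, distilled into a standalone sketch (`demo` and its parameter names are illustrative, not part of the generated client):

```python
def demo(kms_config, **kwargs):
    """Mirror the generated clients: whitelist optional kwargs, reject the rest."""
    all_params = {'async_req', '_return_http_data_only',
                  '_preload_content', '_request_timeout'}
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s' to method demo" % key)
    if kms_config is None:
        raise ValueError("Missing the required parameter `kms_config`")
    return dict(kms_config=kms_config, **kwargs)

demo({'vault': 'kv'}, async_req=True)   # accepted; demo({'vault': 'kv'}, bogus=1) raises
```

The generated code does the same job with `locals()` and a list, which additionally lets it forward the surviving `params` dict straight into `call_api`.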
# File: PTA/GPLT/L1-Python/052.py (repo: cnsteven/online-judge, license: MIT)
print("2018\nwo3 men2 yao4 ying2 !")
# File: updatesproducer/db/iupdates_repository.py (repo: AppleteeYT/Iris, license: Apache-2.0)
from abc import ABC
class IUpdatesRepository(ABC):
def get_user_latest_update_time(self, user_id):
pass
def set_user_latest_update_time(self, user_id, latest_update_time):
pass
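A minimal concrete implementation of this interface, sketched here for illustration (`InMemoryUpdatesRepository` is not part of the original repo; the base class is repeated so the example is self-contained):

```python
from abc import ABC


class IUpdatesRepository(ABC):  # interface as defined above
    def get_user_latest_update_time(self, user_id):
        pass

    def set_user_latest_update_time(self, user_id, latest_update_time):
        pass


class InMemoryUpdatesRepository(IUpdatesRepository):
    """Keeps the latest update time per user in a plain dict."""

    def __init__(self):
        self._latest = {}

    def get_user_latest_update_time(self, user_id):
        return self._latest.get(user_id)  # None if the user was never seen

    def set_user_latest_update_time(self, user_id, latest_update_time):
        self._latest[user_id] = latest_update_time
```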
# File: pyra_pytorch/__init__.py (repo: vlbthambawita/pyra-pytorch, license: MIT)
from pyra_pytorch.pyra_pytorch import DatasetWithGridEncoding as PYRADataset
from pyra_pytorch.pyra_pytorch import DatasetWithGridEncodingFromFilePaths as PYRADatasetFromPaths
from pyra_pytorch.pyra_pytorch import DatasetWithGridEncodingFromDataFrame as PYRADatasetFromDF
# File: TransmitRF_OK.py (repo: PJCzx/Rx-Tx, license: MIT)
import time
import sys
import RPi.GPIO as GPIO
up = 'up'
up_1 = '101011100110111011011110101001000111111101111111111110100110110000'
up_2 = '101011100110111011011110101001000111111101111111111011011101111110'
down = 'down'
down_1 = '101011100110111011011110101001000111111101111111110111100101000000'
down_2 = '101011100110111011011110101001000111111101111111111011011101111110'
stop = 'stop'
stop_1 = '101011100110111011011110101001000111111101111111111011100110000000'
pause = 0.5
short_delay = 0.00018
long_delay = 0.00058
extended_delay = 0.00518
half_code_delay = 0.01033
NUM_ATTEMPTS = 4
TRANSMIT_PIN = 23
print "var settings done"
def transmit_code(code):
'''Transmit a chosen code string using the GPIO transmitter'''
print "entering code"
GPIO.setmode(GPIO.BCM)
GPIO.setup(TRANSMIT_PIN, GPIO.OUT)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(pause)
# UP COMMAND
if code == up or code == 100:
print "going up"
for t in range(NUM_ATTEMPTS):
print "entering for %s on %s" % (t, NUM_ATTEMPTS)
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(extended_delay)
for i in up_1:
if i == '1':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(short_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(long_delay)
print " 1 sent"
elif i == '0':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(long_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(short_delay)
print " 0 sent"
else:
continue
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(extended_delay)
print " extended delay sent"
print "out of for loop up_1"
time.sleep(half_code_delay)
print "adding half code delay"
for t in range(NUM_ATTEMPTS):
print "entering for %s on %s" % (t, NUM_ATTEMPTS)
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(extended_delay)
for i in up_2:
if i == '1':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(short_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(long_delay)
print " 1 sent"
elif i == '0':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(long_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(short_delay)
print " 0 sent"
else:
continue
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(extended_delay)
print " extended delay sent"
print "out of for loop up_2"
print "out of if up"
# END UP COMMAND
# DOWN COMMAND
if code == down or code == 0:
print "going down"
for t in range(NUM_ATTEMPTS):
print "entering for %s on %s" % (t, NUM_ATTEMPTS)
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(extended_delay)
for i in down_1:
if i == '1':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(short_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(long_delay)
print " 1 sent"
elif i == '0':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(long_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(short_delay)
print " 0 sent"
else:
continue
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(extended_delay)
print " extended delay sent"
print "out of for loop down_1"
time.sleep(half_code_delay)
print "adding half code delay"
for t in range(NUM_ATTEMPTS):
print "entering for %s on %s" % (t, NUM_ATTEMPTS)
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(extended_delay)
for i in down_2:
if i == '1':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(short_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(long_delay)
print " 1 sent"
elif i == '0':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(long_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(short_delay)
print " 0 sent"
else:
continue
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(extended_delay)
print " extended delay sent"
print "out of for loop down_2"
print "out of if down"
# END DOWN COMMAND
# STOP COMMAND
if code == stop or 0 < code < 100:
print "stopping"
for t in range(NUM_ATTEMPTS):
print "entering for %s on %s" % (t, NUM_ATTEMPTS)
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(extended_delay)
for i in stop_1:
if i == '1':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(short_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(long_delay)
print " 1 sent"
elif i == '0':
GPIO.output(TRANSMIT_PIN, 1)
time.sleep(long_delay)
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(short_delay)
print " 0 sent"
else:
continue
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(extended_delay)
print " extended delay sent"
print "out of for loop stop_1"
# time.sleep(half_code_delay)
# print "adding half code delay"
# for t in range(NUM_ATTEMPTS):
# print "entering for %s on %s" % (t, NUM_ATTEMPTS)
# GPIO.output(TRANSMIT_PIN, 1)
# time.sleep(extended_delay)
# for i in stop_2:
# if i == '1':
# GPIO.output(TRANSMIT_PIN, 1)
# time.sleep(short_delay)
# GPIO.output(TRANSMIT_PIN, 0)
# time.sleep(long_delay)
# print " 1 sent"
# elif i == '0':
# GPIO.output(TRANSMIT_PIN, 1)
# time.sleep(long_delay)
# GPIO.output(TRANSMIT_PIN, 0)
# time.sleep(short_delay)
# print " 0 sent"
# else:
# continue
# GPIO.output(TRANSMIT_PIN, 0)
# time.sleep(extended_delay)
# print " extended delay sent"
# print "out of for loop stop_2"
print "out of if stop"
# END STOP COMMAND
GPIO.output(TRANSMIT_PIN, 0)
time.sleep(half_code_delay)
GPIO.cleanup()
if __name__ == '__main__':
    COMMANDS = {'up': up, 'down': down, 'stop': stop}
    for argument in sys.argv[1:]:
        # 'up'/'down'/'stop' map to the module constants; anything else is a numeric level.
        # This replaces the original exec() call, which evaluated untrusted argv as code.
        transmit_code(COMMANDS[argument] if argument in COMMANDS else int(argument))
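The three command branches above run the same pulse-width-keyed bit loop with only the code string changing. One way the duplication could be factored out (`send_frame` and `emit` are illustrative names; `emit` stands in for `GPIO.output(TRANSMIT_PIN, ...)`):

```python
short_delay, long_delay, extended_delay = 0.00018, 0.00058, 0.00518

def send_frame(bits, emit, sleep, attempts=4):
    """Key one code string: '1' is short-high/long-low, '0' is long-high/short-low."""
    for _ in range(attempts):
        emit(1)
        sleep(extended_delay)      # leading sync pulse
        for b in bits:
            high, low = (short_delay, long_delay) if b == '1' else (long_delay, short_delay)
            emit(1)
            sleep(high)
            emit(0)
            sleep(low)
        emit(0)
        sleep(extended_delay)      # inter-frame gap
```

With this helper each branch would reduce to e.g. `send_frame(up_1, lambda v: GPIO.output(TRANSMIT_PIN, v), time.sleep)`, then `time.sleep(half_code_delay)`, then the second frame.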
# File: mimo/abstraction.py (repo: pnickl/mimo, license: MIT)
import abc
import copy
from operator import add, sub
from functools import reduce
from future.utils import with_metaclass
from mimo.util.data import islist
# Base classes
class Distribution(with_metaclass(abc.ABCMeta, object)):
@abc.abstractmethod
def rvs(self, size=1):
# random variates (samples)
pass
@abc.abstractmethod
def log_likelihood(self, x):
"""
log likelihood (either log probability mass function or log probability
density function) of x, which has the same type as the output of rvs()
"""
pass
@abc.abstractmethod
def mean(self):
pass
@abc.abstractmethod
def mode(self):
pass
@abc.abstractmethod
def log_partition(self):
pass
@abc.abstractmethod
def entropy(self):
pass
class BayesianDistribution(with_metaclass(abc.ABCMeta, Distribution)):
@abc.abstractmethod
def empirical_bayes(self, data):
"""
(optional) set hyperparameters via empirical bayes
e.g. treat argument as a pseudo-dataset for exponential family
"""
raise NotImplementedError
# Algorithm interfaces for inference in distributions
@abc.abstractmethod
def resample(self, data=[]):
pass
@abc.abstractmethod
def copy_sample(self):
"""
return an object copy suitable for making lists of posterior samples
(override this method to prevent copying shared structures into each sample)
"""
return copy.deepcopy(self)
@abc.abstractmethod
def resample_and_copy(self):
self.resample()
return self.copy_sample()
@abc.abstractmethod
def expected_log_likelihood(self, data):
pass
@abc.abstractmethod
def meanfield_update(self, data, weights):
pass
@abc.abstractmethod
def variational_lowerbound(self):
raise NotImplementedError
@abc.abstractmethod
def meanfield_sgdstep(self, stats, weights, prob, stepsize):
pass
@abc.abstractmethod
def max_likelihood(self, data, weights=None):
"""
sets the parameters to their maximum likelihood values given the
(weighted) data
"""
pass
@property
def nb_params(self):
raise NotImplementedError
@abc.abstractmethod
def max_aposteriori(self, data, weights=None):
"""
sets the parameters to their MAP values given the (weighted) data
analogous to max_likelihood but includes hyperparameters
"""
pass
class Conditional(with_metaclass(abc.ABCMeta, object)):
@abc.abstractmethod
def rvs(self, x):
# random variates (samples)
pass
@abc.abstractmethod
def log_likelihood(self, y, x):
"""
log likelihood (either log probability mass function or log probability
density function) of y, which has the same type as the output of rvs()
x is a conditional variable of the density
"""
pass
@abc.abstractmethod
def mean(self, x):
pass
@abc.abstractmethod
def mode(self, x):
pass
@abc.abstractmethod
def log_partition(self):
pass
@abc.abstractmethod
def entropy(self):
pass
class BayesianConditional(with_metaclass(abc.ABCMeta, Conditional)):
def empirical_bayes(self, data):
"""
(optional) set hyperparameters via empirical bayes
e.g. treat argument as a pseudo-dataset for exponential family
"""
raise NotImplementedError
@abc.abstractmethod
def resample(self, data=[]):
pass
def resample_and_copy(self):
self.resample()
return self.copy_sample()
@abc.abstractmethod
def expected_log_likelihood(self, x):
pass
@abc.abstractmethod
def meanfield_update(self, data, weights):
pass
def variational_lowerbound(self):
raise NotImplementedError
@abc.abstractmethod
def meanfield_sgdstep(self, stats, weights, prob, stepsize):
pass
@abc.abstractmethod
def max_likelihood(self, data, weights=None):
"""
sets the parameters to their maximum likelihood values given the
(weighted) data
"""
pass
@property
def nb_params(self):
raise NotImplementedError
@abc.abstractmethod
def max_aposteriori(self, data, weights=None):
"""
sets the parameters to their MAP values given the (weighted) data
analogous to max_likelihood but includes hyperparameters
"""
pass
class Statistics(tuple):
def __new__(cls, x):
return tuple.__new__(Statistics, x)
def __add__(self, y):
gsum = lambda x, y: reduce(lambda a, b: list(map(add, a, b)) if islist(x, y) else a + b, [x, y])
return Statistics(tuple(map(gsum, self, y)))
def __sub__(self, y):
gsub = lambda x, y: reduce(lambda a, b: list(map(sub, a, b)) if islist(x, y) else a - b, [x, y])
return Statistics(tuple(map(gsub, self, y)))
def __mul__(self, a):
return Statistics(a * e for e in self)
def __rmul__(self, a):
return Statistics(a * e for e in self)
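A quick check of the elementwise semantics of `Statistics` (the class is condensed below, with a stand-in `islist` in place of `mimo.util.data.islist`, purely so the example runs on its own):

```python
from functools import reduce
from operator import add

def islist(x, y):                    # stand-in for mimo.util.data.islist
    return isinstance(x, list) and isinstance(y, list)

class Statistics(tuple):             # condensed copy of the class above
    def __new__(cls, x):
        return tuple.__new__(Statistics, x)
    def __add__(self, y):
        gsum = lambda x, y: reduce(
            lambda a, b: list(map(add, a, b)) if islist(x, y) else a + b, [x, y])
        return Statistics(tuple(map(gsum, self, y)))
    def __mul__(self, a):
        return Statistics(a * e for e in self)

s = Statistics((1, [2, 3])) + Statistics((10, [20, 30]))
assert tuple(s) == (11, [22, 33])    # scalars add, list entries add elementwise
assert tuple(Statistics((2, 4)) * 3) == (6, 12)
```

This is what lets sufficient statistics with mixed scalar and per-dimension components be pooled with plain `+` and reweighted with `*`.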
699d391a3f400c4143515472e690f2751f026a68 | 81,481 | py | Python | examples/subsystem/spacecraft_parsed_classes.py | WinstonPais/hydrus | 0d1ccaa8fd2512d815f32c627327d0ce4770c220 | [
"MIT"
] | 214 | 2017-02-06T17:52:18.000Z | 2022-02-14T08:42:34.000Z | examples/subsystem/spacecraft_parsed_classes.py | WinstonPais/hydrus | 0d1ccaa8fd2512d815f32c627327d0ce4770c220 | [
"MIT"
] | 447 | 2016-12-16T11:33:43.000Z | 2022-02-25T10:44:14.000Z | examples/subsystem/spacecraft_parsed_classes.py | WinstonPais/hydrus | 0d1ccaa8fd2512d815f32c627327d0ce4770c220 | [
"MIT"
] | 233 | 2017-05-30T08:33:05.000Z | 2022-03-04T12:05:17.000Z | """Parsed classes generated by hydra-openapi-parser(https://github.com/HTTP-APIs/hydra-openapi-parser) for spacecraft data."""
parsed_classes = [
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
}
],
"title": "cubicMillimeters",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters",
"label": "Creates a new cubicMillimeters entity",
"method": "POST",
"@id": "_:cubicMillimeters_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the cubicMillimeters entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters",
"label": "Replaces an existing cubicMillimeters entity",
"method": "PUT",
"@id": "_:cubicMillimeters_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a cubicMillimeters entity",
"method": "DELETE",
"@id": "_:cubicMillimeters_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the cubicMillimeters entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters",
"label": "Retrieves a cubicMillimeters entity",
"method": "GET",
"@id": "_:cubicMillimeters_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "unit of measure for volume",
"@id": "http://ontology.projectchronos.eu/subsystems/cubicMillimeters"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/objective",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/isComponentOf",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasVoltage",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Detector",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector",
"label": "Creates a new Spacecraft_Detector entity",
"method": "POST",
"@id": "_:Spacecraft_Detector_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Detector entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector",
"label": "Replaces an existing Spacecraft_Detector entity",
"method": "PUT",
"@id": "_:Spacecraft_Detector_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Detector entity",
"method": "DELETE",
"@id": "_:Spacecraft_Detector_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Detector entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector",
"label": "Retrieves a Spacecraft_Detector entity",
"method": "GET",
"@id": "_:Spacecraft_Detector_retrieve",
"description": "null",
"expects": "null"
}
],
        "description": "A space detector is a sensor, supported by another device that lets it collect data, that is deployed in a spacecraft and works outside Earth's lower atmosphere",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Detector"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/typeOfPropellant",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasSpecificImpulse",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Propulsion",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion",
"label": "Creates a new Spacecraft_Propulsion entity",
"method": "POST",
"@id": "_:Spacecraft_Propulsion_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Propulsion entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion",
"label": "Replaces an existing Spacecraft_Propulsion entity",
"method": "PUT",
"@id": "_:Spacecraft_Propulsion_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Propulsion entity",
"method": "DELETE",
"@id": "_:Spacecraft_Propulsion_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Propulsion entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion",
"label": "Retrieves a Spacecraft_Propulsion entity",
"method": "GET",
"@id": "_:Spacecraft_Propulsion_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used for impelling (processes of applying a force which results in translational motion) a spacecraft, in the specific http://umbel.org/umbel/rc/ProjectilePropelling",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Propulsion"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasEfficiency",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasVoltage",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_PrimaryPower",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower",
"label": "Creates a new Spacecraft_PrimaryPower entity",
"method": "POST",
"@id": "_:Spacecraft_PrimaryPower_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_PrimaryPower entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower",
"label": "Replaces an existing Spacecraft_PrimaryPower entity",
"method": "PUT",
"@id": "_:Spacecraft_PrimaryPower_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_PrimaryPower entity",
"method": "DELETE",
"@id": "_:Spacecraft_PrimaryPower_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_PrimaryPower entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower",
"label": "Retrieves a Spacecraft_PrimaryPower entity",
"method": "GET",
"@id": "_:Spacecraft_PrimaryPower_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used for collecting energy.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_PrimaryPower"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_BackupPower",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower",
"label": "Creates a new Spacecraft_BackupPower entity",
"method": "POST",
"@id": "_:Spacecraft_BackupPower_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_BackupPower entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower",
"label": "Replaces an existing Spacecraft_BackupPower entity",
"method": "PUT",
"@id": "_:Spacecraft_BackupPower_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_BackupPower entity",
"method": "DELETE",
"@id": "_:Spacecraft_BackupPower_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_BackupPower entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower",
"label": "Retrieves a Spacecraft_BackupPower entity",
"method": "GET",
"@id": "_:Spacecraft_BackupPower_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used for storing energy.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_BackupPower"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Thermal",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal",
"label": "Creates a new Spacecraft_Thermal entity",
"method": "POST",
"@id": "_:Spacecraft_Thermal_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal",
"label": "Replaces an existing Spacecraft_Thermal entity",
"method": "PUT",
"@id": "_:Spacecraft_Thermal_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Thermal entity",
"method": "DELETE",
"@id": "_:Spacecraft_Thermal_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal",
"label": "Retrieves a Spacecraft_Thermal entity",
"method": "GET",
"@id": "_:Spacecraft_Thermal_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Shields, shells or any device insulation from/reflecting radiation exploiting emission and absorption events",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Thermal_PassiveDevice",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice",
"label": "Creates a new Spacecraft_Thermal_PassiveDevice entity",
"method": "POST",
"@id": "_:Spacecraft_Thermal_PassiveDevice_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal_PassiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice",
"label": "Replaces an existing Spacecraft_Thermal_PassiveDevice entity",
"method": "PUT",
"@id": "_:Spacecraft_Thermal_PassiveDevice_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Thermal_PassiveDevice entity",
"method": "DELETE",
"@id": "_:Spacecraft_Thermal_PassiveDevice_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal_PassiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice",
"label": "Retrieves a Spacecraft_Thermal_PassiveDevice entity",
"method": "GET",
"@id": "_:Spacecraft_Thermal_PassiveDevice_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "They are passive because they mostly transform radiation into heating/cooling ",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_PassiveDevice"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Thermal_ActiveDevice",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice",
"label": "Creates a new Spacecraft_Thermal_ActiveDevice entity",
"method": "POST",
"@id": "_:Spacecraft_Thermal_ActiveDevice_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal_ActiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice",
"label": "Replaces an existing Spacecraft_Thermal_ActiveDevice entity",
"method": "PUT",
"@id": "_:Spacecraft_Thermal_ActiveDevice_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Thermal_ActiveDevice entity",
"method": "DELETE",
"@id": "_:Spacecraft_Thermal_ActiveDevice_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Thermal_ActiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice",
"label": "Retrieves a Spacecraft_Thermal_ActiveDevice entity",
"method": "GET",
"@id": "_:Spacecraft_Thermal_ActiveDevice_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used to protect sensors or electronic devices from over/under-heating, like refrigeration absorption.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Thermal_ActiveDevice"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/standsMaxTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Structure",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure",
"label": "Creates a new Spacecraft_Structure entity",
"method": "POST",
"@id": "_:Spacecraft_Structure_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Structure entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure",
"label": "Replaces an existing Spacecraft_Structure entity",
"method": "PUT",
"@id": "_:Spacecraft_Structure_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Structure entity",
"method": "DELETE",
"@id": "_:Spacecraft_Structure_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Structure entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure",
"label": "Retrieves a Spacecraft_Structure entity",
"method": "GET",
"@id": "_:Spacecraft_Structure_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "It's the skeleton and framework of the spacecraft.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Structure"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasVoltage",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMaxClock",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMinClock",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasDataStorage",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasDataStorageExternal",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasRAM",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMinTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMaxTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_CDH",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH",
"label": "Creates a new Spacecraft_CDH entity",
"method": "POST",
"@id": "_:Spacecraft_CDH_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_CDH entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH",
"label": "Replaces an existing Spacecraft_CDH entity",
"method": "PUT",
"@id": "_:Spacecraft_CDH_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_CDH entity",
"method": "DELETE",
"@id": "_:Spacecraft_CDH_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_CDH entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH",
"label": "Retrieves a Spacecraft_CDH entity",
"method": "GET",
"@id": "_:Spacecraft_CDH_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "The DH system shall: Enable HK and science data flow \u2013 Housekeeping data (Temperatures, Pressures, Voltages, Currents, Status,...) \u2013 Attitude data \u2013 Payload data (e.g., Science data) - Receive and distribute commands - Perform TM and TC protocols - Distribute timing signals - Synchronization of data \u2013 Time stamping of data - Provide data storage - Execute commands and schedules - Control subsystems and payloads - Monitor spacecraft health - Make autonomous decisions - Perform data compression.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_CDH"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMinTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasMaxTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_Communication",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication",
"label": "Creates a new Spacecraft_Communication entity",
"method": "POST",
"@id": "_:Spacecraft_Communication_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Communication entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication",
"label": "Replaces an existing Spacecraft_Communication entity",
"method": "PUT",
"@id": "_:Spacecraft_Communication_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_Communication entity",
"method": "DELETE",
"@id": "_:Spacecraft_Communication_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_Communication entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication",
"label": "Retrieves a Spacecraft_Communication entity",
"method": "GET",
"@id": "_:Spacecraft_Communication_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used for transmitting/receiving radio waves.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_Communication"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "embedSensor",
"property": "http://ontology.projectchronos.eu/subsystems/embedSensor",
"description": "a subsystem that holds a sensor",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsytems/function",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/standsMaxTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/maxWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/minWorkingTemperature",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/subSystemType",
"writeonly": "false",
"readonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireOutWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_AODCS",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS",
"label": "Creates a new Spacecraft_AODCS entity",
"method": "POST",
"@id": "_:Spacecraft_AODCS_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS",
"label": "Replaces an existing Spacecraft_AODCS entity",
"method": "PUT",
"@id": "_:Spacecraft_AODCS_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_AODCS entity",
"method": "DELETE",
"@id": "_:Spacecraft_AODCS_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS",
"label": "Retrieves a Spacecraft_AODCS entity",
"method": "GET",
"@id": "_:Spacecraft_AODCS_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Complex devices-subsystems used to set the direction and the position of the spacecraft, it controls flight dynamics.",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_AODCS_Active",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice",
"label": "Creates a new Spacecraft_AODCS_Active entity",
"method": "POST",
"@id": "_:Spacecraft_AODCS_Active_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS_Active entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice",
"label": "Replaces an existing Spacecraft_AODCS_Active entity",
"method": "PUT",
"@id": "_:Spacecraft_AODCS_Active_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_AODCS_Active entity",
"method": "DELETE",
"@id": "_:Spacecraft_AODCS_Active_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS_Active entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice",
"label": "Retrieves a Spacecraft_AODCS_Active entity",
"method": "GET",
"@id": "_:Spacecraft_AODCS_Active_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "Do NOT use any additional power from the spacecraft generator",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_ActiveDevice"
},
{
"@type": "Class",
"supportedProperty": [
{
"@type": "SupportedProperty",
"title": "isSubsystemOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isSubsystemOf",
"description": "subject is a device or a system of devices that is subsystem of a wider system or device",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isComponentOf",
"property": "http://ontology.projectchronos.eu/spacecraft/isComponentOf",
"description": "the subject is a member of a wider artifact, that is a set of artifacts",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "hasSubSystem",
"property": "http://ontology.projectchronos.eu/spacecraft/hasSubSystem",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"@type": "SupportedProperty",
"title": "isDeployedIn",
"property": "http://ontology.projectchronos.eu/spacecraft/isDeployedIn",
"description": "the environment in which a device or a system of devices is designed to work",
"readonly": "false",
"required": "false",
"writeonly": "false"
},
{
"required": "false",
"@type": "SupportedProperty",
"property": "http://ontology.projectchronos.eu/subsystems/hasWireInWith",
"writeonly": "false",
"readonly": "false"
}
],
"title": "Spacecraft_AODCS_PassiveDevice",
"supportedOperation": [
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice",
"label": "Creates a new Spacecraft_AODCS_PassiveDevice entity",
"method": "POST",
"@id": "_:Spacecraft_AODCS_PassiveDevice_create",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS_PassiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice",
"label": "Replaces an existing Spacecraft_AODCS_PassiveDevice entity",
"method": "PUT",
"@id": "_:Spacecraft_AODCS_PassiveDevice_replace",
"description": "null",
"expects": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice"
},
{
"statusCodes": [],
"@type": "hydraspec:Operation",
"returns": "null",
"label": "Deletes a Spacecraft_AODCS_PassiveDevice entity",
"method": "DELETE",
"@id": "_:Spacecraft_AODCS_PassiveDevice_delete",
"description": "null",
"expects": "null"
},
{
"statusCodes": [
{
"code": 404,
"description": "If the Spacecraft_AODCS_PassiveDevice entity wasn't found."
}
],
"@type": "hydraspec:Operation",
"returns": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice",
"label": "Retrieves a Spacecraft_AODCS_PassiveDevice entity",
"method": "GET",
"@id": "_:Spacecraft_AODCS_PassiveDevice_retrieve",
"description": "null",
"expects": "null"
}
],
"description": "DO use any additional power from the spacecraft generator",
"@id": "http://ontology.projectchronos.eu/subsystems/Spacecraft_AODCS_PassiveDevice"
}
]
# Lint as: python3
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for cap_floor.py."""
from absl.testing import parameterized
import numpy as np
import tensorflow.compat.v2 as tf
import tf_quant_finance as tff
from tensorflow.python.framework import test_util # pylint: disable=g-direct-tensorflow-import
@test_util.run_all_in_graph_and_eager_modes
class HullWhiteSwaptionTest(parameterized.TestCase, tf.test.TestCase):
def setUp(self):
self.mean_reversion_1d = [0.03]
self.volatility_1d = [0.02]
self.volatility_time_dep_1d = [0.01, 0.02]
self.mean_reversion_2d = [0.03, 0.03]
self.volatility_2d = [0.02, 0.02]
self.expiries = np.array([1.0])
self.float_leg_start_times = np.array([1.0, 1.25, 1.5, 1.75])
self.float_leg_end_times = np.array([1.25, 1.5, 1.75, 2.0])
self.fixed_leg_payment_times = np.array([1.25, 1.5, 1.75, 2.0])
self.float_leg_daycount_fractions = 0.25 * np.ones_like(
self.float_leg_start_times)
self.fixed_leg_daycount_fractions = 0.25 * np.ones_like(
self.fixed_leg_payment_times)
self.fixed_leg_coupon = 0.011 * np.ones_like(self.fixed_leg_payment_times)
self.expiries_1d = np.array([1.0, 1.0])
self.float_leg_start_times_1d = np.array([[1.0, 1.25, 1.5, 1.75],
[1.0, 1.25, 1.5, 1.75]])
self.float_leg_end_times_1d = np.array([[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]])
self.fixed_leg_payment_times_1d = np.array([[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]])
self.float_leg_daycount_fractions_1d = 0.25 * np.ones_like(
self.float_leg_start_times_1d)
self.fixed_leg_daycount_fractions_1d = 0.25 * np.ones_like(
self.fixed_leg_payment_times_1d)
self.fixed_leg_coupon_1d = 0.011 * np.ones_like(
self.fixed_leg_payment_times_1d)
self.expiries_2d = np.array([[1.0, 1.0], [1.0, 1.0]])
self.float_leg_start_times_2d = np.array([[[1.0, 1.25, 1.5, 1.75],
[1.0, 1.25, 1.5, 1.75]],
[[1.0, 1.25, 1.5, 1.75],
[1.0, 1.25, 1.5, 1.75]]])
self.float_leg_end_times_2d = np.array([[[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]],
[[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]]])
self.fixed_leg_payment_times_2d = np.array([[[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]],
[[1.25, 1.5, 1.75, 2.0],
[1.25, 1.5, 1.75, 2.0]]])
self.float_leg_daycount_fractions_2d = 0.25 * np.ones_like(
self.float_leg_start_times_2d)
self.fixed_leg_daycount_fractions_2d = 0.25 * np.ones_like(
self.fixed_leg_payment_times_2d)
self.fixed_leg_coupon_2d = 0.011 * np.ones_like(
self.fixed_leg_payment_times_2d)
super(HullWhiteSwaptionTest, self).setUp()
@parameterized.named_parameters(
{
'testcase_name': 'analytic',
'use_analytic_pricing': True,
'error_tol': 1e-8,
}, {
'testcase_name': 'simulation',
'use_analytic_pricing': False,
'error_tol': 1e-3,
})
def test_correctness_1d(self, use_analytic_pricing, error_tol):
"""Tests model with constant parameters in 1 dimension."""
# 1y x 1y swaption with quarterly payments.
dtype = tf.float64
zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
price = tff.models.hull_white.swaption_price(
expiries=self.expiries,
floating_leg_start_times=self.float_leg_start_times,
floating_leg_end_times=self.float_leg_end_times,
fixed_leg_payment_times=self.fixed_leg_payment_times,
floating_leg_daycount_fractions=self.float_leg_daycount_fractions,
fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions,
fixed_leg_coupon=self.fixed_leg_coupon,
reference_rate_fn=zero_rate_fn,
notional=100.,
dim=1,
mean_reversion=self.mean_reversion_1d,
volatility=self.volatility_1d,
use_analytic_pricing=use_analytic_pricing,
num_samples=500000,
time_step=0.1,
random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
seed=0,
dtype=dtype)
self.assertEqual(price.dtype, dtype)
self.assertAllEqual(price.shape, [1, 1])
price = self.evaluate(price)
self.assertAllClose(price, [[0.7163243383624043]],
rtol=error_tol, atol=error_tol)
@parameterized.named_parameters(
{
'testcase_name': 'analytic',
'use_analytic_pricing': True,
'error_tol': 1e-8,
}, {
'testcase_name': 'simulation',
'use_analytic_pricing': False,
'error_tol': 5e-3,
})
def test_receiver_1d(self, use_analytic_pricing, error_tol):
"""Tests model with constant parameters in 1 dimension."""
# 1y x 1y receiver swaption with quarterly payments.
dtype = tf.float64
zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
price = tff.models.hull_white.swaption_price(
expiries=self.expiries,
floating_leg_start_times=self.float_leg_start_times,
floating_leg_end_times=self.float_leg_end_times,
fixed_leg_payment_times=self.fixed_leg_payment_times,
floating_leg_daycount_fractions=self.float_leg_daycount_fractions,
fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions,
fixed_leg_coupon=self.fixed_leg_coupon,
reference_rate_fn=zero_rate_fn,
notional=100.,
dim=1,
mean_reversion=self.mean_reversion_1d,
volatility=self.volatility_1d,
is_payer_swaption=False,
use_analytic_pricing=use_analytic_pricing,
num_samples=500000,
time_step=0.1,
random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
seed=0,
dtype=dtype)
self.assertEqual(price.dtype, dtype)
self.assertAllEqual(price.shape, [1, 1])
price = self.evaluate(price)
self.assertAllClose(price, [[0.813482544626056]],
rtol=error_tol, atol=error_tol)
@parameterized.named_parameters(
{
'testcase_name': 'analytic',
'use_analytic_pricing': True,
'error_tol': 1e-8,
}, {
'testcase_name': 'simulation',
'use_analytic_pricing': False,
'error_tol': 1e-3,
})
def test_time_dep_1d(self, use_analytic_pricing, error_tol):
"""Tests model with time-dependent parameters in 1 dimension."""
# 1y x 1y swaption with quarterly payments.
dtype = tf.float64
zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
volatility = tff.math.piecewise.PiecewiseConstantFunc(
jump_locations=[0.5], values=self.volatility_time_dep_1d,
dtype=dtype)
price = tff.models.hull_white.swaption_price(
expiries=self.expiries,
floating_leg_start_times=self.float_leg_start_times,
floating_leg_end_times=self.float_leg_end_times,
fixed_leg_payment_times=self.fixed_leg_payment_times,
floating_leg_daycount_fractions=self.float_leg_daycount_fractions,
fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions,
fixed_leg_coupon=self.fixed_leg_coupon,
reference_rate_fn=zero_rate_fn,
notional=100.,
dim=1,
mean_reversion=self.mean_reversion_1d,
volatility=volatility,
use_analytic_pricing=use_analytic_pricing,
num_samples=1000000,
time_step=0.1,
random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
seed=0,
dtype=dtype)
self.assertEqual(price.dtype, dtype)
self.assertAllEqual(price.shape, [1, 1])
price = self.evaluate(price)
self.assertAllClose(price, [[0.5593057004094042]],
rtol=error_tol, atol=error_tol)
@parameterized.named_parameters(
{
'testcase_name': 'analytic',
'use_analytic_pricing': True,
'error_tol': 1e-8,
}, {
'testcase_name': 'simulation',
'use_analytic_pricing': False,
'error_tol': 1e-3,
})
def test_1d_batch_1d(self, use_analytic_pricing, error_tol):
"""Tests 1-d batch."""
dtype = tf.float64
zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
price = tff.models.hull_white.swaption_price(
expiries=self.expiries_1d,
floating_leg_start_times=self.float_leg_start_times_1d,
floating_leg_end_times=self.float_leg_end_times_1d,
fixed_leg_payment_times=self.fixed_leg_payment_times_1d,
floating_leg_daycount_fractions=self.float_leg_daycount_fractions_1d,
fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions_1d,
fixed_leg_coupon=self.fixed_leg_coupon_1d,
reference_rate_fn=zero_rate_fn,
notional=100.,
dim=1,
mean_reversion=self.mean_reversion_1d,
volatility=self.volatility_1d,
use_analytic_pricing=use_analytic_pricing,
num_samples=500000,
time_step=0.1,
random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
seed=0,
dtype=dtype)
self.assertEqual(price.dtype, dtype)
self.assertAllEqual(price.shape, [2, 1])
price = self.evaluate(price)
self.assertAllClose(price, [[0.7163243383624043],
[0.7163243383624043]],
rtol=error_tol, atol=error_tol)

  @parameterized.named_parameters(
      {
          'testcase_name': 'analytic',
          'use_analytic_pricing': True,
          'error_tol': 1e-8,
      }, {
          'testcase_name': 'simulation',
          'use_analytic_pricing': False,
          'error_tol': 1e-3,
      })
  def test_2d_batch_1d(self, use_analytic_pricing, error_tol):
    """Tests 2-d batch."""
    dtype = tf.float64
    zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
    price = tff.models.hull_white.swaption_price(
        expiries=self.expiries_2d,
        floating_leg_start_times=self.float_leg_start_times_2d,
        floating_leg_end_times=self.float_leg_end_times_2d,
        fixed_leg_payment_times=self.fixed_leg_payment_times_2d,
        floating_leg_daycount_fractions=self.float_leg_daycount_fractions_2d,
        fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions_2d,
        fixed_leg_coupon=self.fixed_leg_coupon_2d,
        reference_rate_fn=zero_rate_fn,
        notional=100.,
        dim=1,
        mean_reversion=self.mean_reversion_1d,
        volatility=self.volatility_1d,
        use_analytic_pricing=use_analytic_pricing,
        num_samples=500000,
        time_step=0.1,
        random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
        seed=0,
        dtype=dtype)
    self.assertEqual(price.dtype, dtype)
    self.assertAllEqual(price.shape, [2, 2, 1])
    price = self.evaluate(price)
    expected = [
        0.7163243383624043, 0.7163243383624043, 0.7163243383624043,
        0.7163243383624043
    ]
    self.assertAllClose(
        price, tf.reshape(expected, (2, 2, 1)), rtol=error_tol, atol=error_tol)

  @parameterized.named_parameters(
      {
          'testcase_name': 'analytic',
          'use_analytic_pricing': True,
          'error_tol': 1e-8,
      }, {
          'testcase_name': 'simulation',
          'use_analytic_pricing': False,
          'error_tol': 1e-3,
      })
  def test_correctness_2d(self, use_analytic_pricing, error_tol):
    """Tests model with constant parameters in 2 dimensions."""
    # 1y x 1y swaption with quarterly payments.
    dtype = tf.float64
    zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
    price = tff.models.hull_white.swaption_price(
        expiries=self.expiries,
        floating_leg_start_times=self.float_leg_start_times,
        floating_leg_end_times=self.float_leg_end_times,
        fixed_leg_payment_times=self.fixed_leg_payment_times,
        floating_leg_daycount_fractions=self.float_leg_daycount_fractions,
        fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions,
        fixed_leg_coupon=self.fixed_leg_coupon,
        reference_rate_fn=zero_rate_fn,
        notional=100.,
        dim=2,
        mean_reversion=self.mean_reversion_2d,
        volatility=self.volatility_2d,
        use_analytic_pricing=use_analytic_pricing,
        num_samples=500000,
        time_step=0.1,
        random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
        seed=0,
        dtype=dtype)
    self.assertEqual(price.dtype, dtype)
    self.assertAllEqual(price.shape, [1, 2])
    price = self.evaluate(price)
    self.assertAllClose(price, [[0.7163243383624043, 0.7163243383624043]],
                        rtol=error_tol, atol=error_tol)

  @parameterized.named_parameters(
      {
          'testcase_name': 'analytic',
          'use_analytic_pricing': True,
          'error_tol': 1e-8,
      }, {
          'testcase_name': 'simulation',
          'use_analytic_pricing': False,
          'error_tol': 5e-3,
      })
  def test_1d_batch_2d(self, use_analytic_pricing, error_tol):
    """Tests 1-d batch."""
    dtype = tf.float64
    zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
    price = tff.models.hull_white.swaption_price(
        expiries=self.expiries_1d,
        floating_leg_start_times=self.float_leg_start_times_1d,
        floating_leg_end_times=self.float_leg_end_times_1d,
        fixed_leg_payment_times=self.fixed_leg_payment_times_1d,
        floating_leg_daycount_fractions=self.float_leg_daycount_fractions_1d,
        fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions_1d,
        fixed_leg_coupon=self.fixed_leg_coupon_1d,
        reference_rate_fn=zero_rate_fn,
        notional=100.,
        dim=2,
        mean_reversion=self.mean_reversion_2d,
        volatility=self.volatility_2d,
        use_analytic_pricing=use_analytic_pricing,
        num_samples=500000,
        time_step=0.1,
        random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
        seed=0,
        dtype=dtype)
    self.assertEqual(price.dtype, dtype)
    self.assertAllEqual(price.shape, [2, 2])
    price = self.evaluate(price)
    self.assertAllClose(price, [[0.7163243383624043, 0.7163243383624043],
                                [0.7163243383624043, 0.7163243383624043]],
                        rtol=error_tol, atol=error_tol)

  @parameterized.named_parameters(
      {
          'testcase_name': 'analytic',
          'use_analytic_pricing': True,
          'error_tol': 1e-8,
      }, {
          'testcase_name': 'simulation',
          'use_analytic_pricing': False,
          'error_tol': 1e-3,
      })
  def test_2d_batch_2d(self, use_analytic_pricing, error_tol):
    """Tests 2-d batch."""
    dtype = tf.float64
    zero_rate_fn = lambda x: 0.01 * tf.ones_like(x, dtype=dtype)
    price = tff.models.hull_white.swaption_price(
        expiries=self.expiries_2d,
        floating_leg_start_times=self.float_leg_start_times_2d,
        floating_leg_end_times=self.float_leg_end_times_2d,
        fixed_leg_payment_times=self.fixed_leg_payment_times_2d,
        floating_leg_daycount_fractions=self.float_leg_daycount_fractions_2d,
        fixed_leg_daycount_fractions=self.fixed_leg_daycount_fractions_2d,
        fixed_leg_coupon=self.fixed_leg_coupon_2d,
        reference_rate_fn=zero_rate_fn,
        notional=100.,
        dim=2,
        mean_reversion=self.mean_reversion_2d,
        volatility=self.volatility_2d,
        use_analytic_pricing=use_analytic_pricing,
        num_samples=500000,
        time_step=0.1,
        random_type=tff.math.random.RandomType.PSEUDO_ANTITHETIC,
        seed=0,
        dtype=dtype)
    self.assertEqual(price.dtype, dtype)
    self.assertAllEqual(price.shape, [2, 2, 2])
    price = self.evaluate(price)
    expected = [
        0.7163243383624043, 0.7163243383624043, 0.7163243383624043,
        0.7163243383624043, 0.7163243383624043, 0.7163243383624043,
        0.7163243383624043, 0.7163243383624043
    ]
    self.assertAllClose(
        price, tf.reshape(expected, (2, 2, 2)), rtol=error_tol, atol=error_tol)


if __name__ == '__main__':
  tf.test.main()
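The simulation cases above draw 500,000 paths with `RandomType.PSEUDO_ANTITHETIC`. Antithetic sampling pairs each normal draw z with -z, which cancels the odd part of the integrand and reduces Monte Carlo variance. A minimal stdlib sketch of the idea (the function names are illustrative, not part of tf_quant_finance):

```python
import random


def mc_estimate(f, n_pairs, seed=0):
    """Plain Monte Carlo estimate of E[f(Z)], Z ~ N(0, 1)."""
    rng = random.Random(seed)
    draws = [rng.gauss(0.0, 1.0) for _ in range(2 * n_pairs)]
    return sum(f(z) for z in draws) / (2 * n_pairs)


def mc_estimate_antithetic(f, n_pairs, seed=0):
    """Antithetic estimate: every draw z is paired with its mirror -z."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_pairs):
        z = rng.gauss(0.0, 1.0)
        total += f(z) + f(-z)  # the odd part of f cancels exactly
    return total / (2 * n_pairs)


# For a linear payoff the antithetic estimate is exact; the plain one is noisy.
linear = lambda z: 1.0 + 2.0 * z
plain = mc_estimate(linear, 1000)
anti = mc_estimate_antithetic(linear, 1000)
print(plain, anti)
```

This is why the test above can demand a tight `error_tol` from a pseudo-random simulation: the antithetic pairing removes the leading source of sampling error for near-linear payoffs.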


# File: SOLO/preprocess/finger_flag.py (repo: youngster-all/Unified-Gesture, license: MIT)
""" Considering the presence of the finger in a dataset as True else False """


class Finger:
    def __init__(self):  # fixed: was misspelled `__int__`, so it never ran as a constructor
        pass

    def SingleOne(self):
        thumb = False
        index = True
        middle = False
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleTwo(self):
        thumb = False
        index = True
        middle = True
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleThree(self):
        thumb = False
        index = True
        middle = True
        ring = True
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleFour(self):
        thumb = False
        index = True
        middle = True
        ring = True
        pinky = True
        return thumb, index, middle, ring, pinky

    def SingleFive(self):
        thumb = True
        index = True
        middle = True
        ring = True
        pinky = True
        return thumb, index, middle, ring, pinky

    def SingleSix(self):
        thumb = True
        index = False
        middle = False
        ring = False
        pinky = True
        return thumb, index, middle, ring, pinky

    def SingleSeven(self):
        thumb = True
        index = True
        middle = False
        ring = False
        pinky = True
        return thumb, index, middle, ring, pinky

    def SingleEight(self):
        thumb = True
        index = True
        middle = False
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleNine(self):
        thumb = False
        index = True
        middle = False
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleGood(self):
        thumb = True
        index = False
        middle = False
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky

    def SingleBad(self):
        thumb = True
        index = False
        middle = False
        ring = False
        pinky = False
        return thumb, index, middle, ring, pinky
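Each method above returns a fixed (thumb, index, middle, ring, pinky) tuple. The same table can be expressed as a single dict lookup; this is a sketch only, not the module's API (`FINGER_FLAGS` and `finger_flags` are invented names):

```python
# Gesture name -> (thumb, index, middle, ring, pinky) presence flags,
# transcribed from the Finger class above.
FINGER_FLAGS = {
    'one':   (False, True, False, False, False),
    'two':   (False, True, True, False, False),
    'three': (False, True, True, True, False),
    'four':  (False, True, True, True, True),
    'five':  (True, True, True, True, True),
    'six':   (True, False, False, False, True),
    'seven': (True, True, False, False, True),
    'eight': (True, True, False, False, False),
    'nine':  (False, True, False, False, False),
    'good':  (True, False, False, False, False),
    'bad':   (True, False, False, False, False),
}


def finger_flags(gesture):
    """Return the (thumb, index, middle, ring, pinky) flags for a gesture name."""
    return FINGER_FLAGS[gesture]


print(finger_flags('three'))
```

A table like this also makes it obvious that 'one'/'nine' and 'good'/'bad' share identical flag patterns in the original class.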


# File: Python/web.py (repo: 2000jedi/WebViewer, license: MIT)
import requests
from tomorrow import threads


class Get:
    def __init__(self, url, param=None, special=None, after_complete=None):
        self.url = url
        self.param = {} if param is None else param
        self.special = special
        self.result = ""
        self.completed = False
        self.thread = None
        self.after_complete_param = after_complete

    def after_completed(self):
        pass

    @threads(5, timeout=30)
    def get(self):
        if self.special is None:
            self.result = requests.get(self.url, params=self.param, timeout=10).text
        elif self.special == 'json':
            self.result = requests.get(self.url, params=self.param).json()
        elif self.special == 'binary':
            self.result = requests.get(self.url, params=self.param).content
        else:
            # fixed: the trailing comma here used to build a tuple instead of a message
            self.result = 'Cannot detect mode ' + str(self.special)
        self.completed = True
        self.after_completed()


class Post:
    def __init__(self, url, param=None, special=None, after_complete=None):
        self.url = url
        self.param = {} if param is None else param
        self.special = special
        self.result = ""
        self.completed = False
        self.thread = None
        self.after_complete_param = after_complete

    def after_completed(self):
        pass

    @threads(5, timeout=30)
    def post(self):
        if self.special is None:
            self.result = requests.post(self.url, params=self.param).text
        elif self.special == 'json':
            self.result = requests.post(self.url, params=self.param).json()
        elif self.special == 'binary':
            self.result = requests.post(self.url, params=self.param).content
        else:
            # fixed: the trailing comma here used to build a tuple instead of a message
            self.result = 'Cannot detect mode ' + str(self.special)
        self.completed = True
        self.after_completed()
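The `@threads(5, timeout=30)` decorator from `tomorrow` runs `get`/`post` on a background thread pool. A similar pattern can be sketched with the stdlib's `concurrent.futures` (the `AsyncTask` class is illustrative; the demo uses a stand-in callable so no network is needed):

```python
from concurrent.futures import ThreadPoolExecutor


class AsyncTask:
    """Run a callable on a worker thread; poll `completed` or block on `result()`."""

    _pool = ThreadPoolExecutor(max_workers=5)  # shared pool, like tomorrow's threads(5)

    def __init__(self, fn, *args, **kwargs):
        self._future = self._pool.submit(fn, *args, **kwargs)

    @property
    def completed(self):
        return self._future.done()

    def result(self, timeout=30):
        return self._future.result(timeout=timeout)


# Demo with a stand-in for requests.get, so the sketch runs offline:
task = AsyncTask(lambda url: 'fetched ' + url, 'http://example.com')
print(task.result())
```

Unlike the `completed` flag above, a `Future` also propagates exceptions raised on the worker thread back to the caller of `result()`, which is easy to lose with a hand-rolled flag.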


# File: examples/math.log2/ex1.py (repo: mcorne/python-by-example, license: MIT)
import math
print(math.log2(65536))
print(math.log(65536, 2))
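Both calls print 16.0 for 65536. `math.log2(x)` is generally the more accurate spelling, because `math.log(x, 2)` is computed as a quotient of natural logarithms and can be off by an ulp; a quick comparison:

```python
import math

# log2 computes the base-2 logarithm directly and is exact for powers of two
# on IEEE-754 platforms; log(x, 2) is evaluated as log(x) / log(2).
for exp in (16, 100, 1000):
    x = 2.0 ** exp
    print(exp, math.log2(x), math.log(x, 2))
```

For small powers the two usually agree; the direct form only pays off at the margins, but it costs nothing to prefer it.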


# File: ListSnippets/lsSnips.py (repo: MooersLab/jupyterlabpymolpysnipsplus, license: MIT)
# Description: List all snips by tab trigger and description.
# Source: placeHolder
"""
cmd.do('"""Tab trigger Description')
cmd.do('--------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------')
cmd.do('antialias Set antialias to get smooth edges')
cmd.do('ao Ambient occlusion.')
cmd.do('aveB4resiX AveBResiX, prints the residue number and the average bfactor.')
cmd.do(' Uses reduce and lambda, builtin Python functional programming functions.')
cmd.do(' Note that you need to convert the length of the list of Bfactors from an integer to a float before division into the sum.')
cmd.do('averageB Uses a regular list as opposed to PyMOL"s stored list. Edit the selection as needed.')
cmd.do('bs Ball and stick representation.')
cmd.do('bu Biological unit.')
cmd.do('carvedDensity Carved electron density.')
cmd.do('cblind Color blindness.')
cmd.do('centerpi Center pi.')
cmd.do('coordinate Coordinate covalent bonds to metals and H-bonds from RNA.')
cmd.do('cribbon Color ribbon H red, strand yellow, loop green.')
cmd.do('cspheres Colored spheres.')
cmd.do('discreteCartoonColoring Turn on discrete colors between secondary structure elements.')
cmd.do('distance H-bond distances.')
cmd.do('doubleBond Valence bond.')
cmd.do('drawHbonds Draw H-bonds.')
cmd.do('duplicateObject Duplicate object.')
cmd.do('ellipcol Set ellipsoid color.')
cmd.do('extractPartObj Create a new object from part of an existing object.')
cmd.do('fasta Print Fasta from PDB file.')
cmd.do('fetch2FoFc Fetch 2FoFc map.')
cmd.do('fetchCIF Fetch cif file.')
cmd.do('fetchFoFc Fetch fofc map.')
cmd.do('fetchPath Set path for location to save fetched pdb files.')
cmd.do('filledRing Filled rings in nucleic acids.')
cmd.do('findHbonds Find hbonds around a residue.')
cmd.do('fog Blur the background atoms.')
cmd.do('getCoordinates Get coordinates.')
cmd.do('hbond Hbond setup.')
cmd.do('hbonddash H-bond dashes.')
cmd.do('hideSelection Turn off magenta squares on current selection.')
cmd.do('hidealtloc Hide alt loc.')
cmd.do('internalGUImode2 Makes the background of the internal gui transparent with the viewport extended into this region of the gui. This may be a useful mode for workshops.')
cmd.do('internalGUIwidth Set the width of the internal gui.')
cmd.do('labelCAs Label the CA atoms with the Ala333 style format')
cmd.do('labelMainChain Label the main chain atoms by resn,resi,atom name.')
cmd.do('labelResnResi Label CA atom with residue name and residue number.')
cmd.do('labelSS Label SS.')
cmd.do('labelWatersHOH Label waters HOH.')
cmd.do('labelWatersW Label waters W.')
cmd.do('loadPDBbs Load PDB ball-and-stick.')
cmd.do('loadPDBfile Load a pdb file in the current directory.')
cmd.do('loadPDBnb Load PDB nb spheres.')
cmd.do('lsSnips List all snips by tab trigger and description')
cmd.do('lspymolrc Print list of active pymolrc files.')
cmd.do('molscriptRibbon Molscript ribbons.')
cmd.do('ms Measure surface area.')
cmd.do('oneLetter One letter amino acid.')
cmd.do('pearl The pearl effect is made with two spheres with the outer sphere being transparent.')
cmd.do('printBs Print the B-factors of a residue.')
cmd.do('printBs2digits Print B-values for a residue with the B"s rounded off to two decimal places.')
cmd.do('printBspartB Print B factors of part B of a residue.')
cmd.do('printDoc Print document string of a function.')
cmd.do('printNameB4ResiX Print name and b-factor for a residue. You can change this to a named selection or a residue range ( e.g., resi 133:155). Use the noH variant if H atoms are present.')
cmd.do('printResiResnNameB4ResiX Print resn, resi, atom name, and b-factor.')
cmd.do('printResiResnNameB4ResiXNoH Print name and b-factor for a residue. You can change this to a named selection or a residue range ( e.g., resi 133:155). The noH variant.')
cmd.do('pseudolabel Position label with pseudoatom.')
cmd.do('puttyCartoon Create a putty cartoon.')
cmd.do('ringMode Set the ring mode to a value between 0 and 6 in cartoons of nucleic acids.')
cmd.do('rmwater Remove waters from molecular object.')
cmd.do('rotate Rotate about axis.')
cmd.do('rv Return settings in rounded format.')
cmd.do('savePNG Save a png file of current scene to the current directory.')
cmd.do('saxsEnvelope Display SAXS envelope')
cmd.do('sc111 Display all symmetry mates in one unit cell. Assumes supercell.py (see PyMOL Wiki) is in $HOME/Scripts/PyMOLscripts/.')
cmd.do('sc222 Run Tom Holder"s supercell script to generate three cells in all directions.')
cmd.do('scaleRadiusColor Scale the radius and color of atoms as spheres by property in the B-value column.')
cmd.do('selectAllBut Select all nitrogen atoms in a selection except from lysine.')
cmd.do('selectAtomsAround Select atoms within a radius around a ligand.')
cmd.do('selectChain Select a chain.')
cmd.do('selectElement Select atoms by element.')
cmd.do('selectHelices Select atoms by alpha helices.')
cmd.do('selectLoops Select atoms by beta loops.')
cmd.do('selectName Select atoms by name.')
cmd.do('selectResi Select residues by a range of numbers.')
cmd.do('selectResidues Select residues by name.')
cmd.do('selectResiduesAround Select residues within a radius around a ligand.')
cmd.do('selectStrands Select atoms by beta strands.')
cmd.do('setcolor Set color name to a RGB code.')
cmd.do('setpath Set additional path for PyMOL to search on startup')
cmd.do('sidehChainHelper In cartoons, hide the backbone atoms of selected residues when showing then as sticks.')
cmd.do('sigDigits Set number of decimals places to show in distance labels.')
cmd.do('sigang Set angle labels to display 2 decimals places')
cmd.do('sigdist set distance labels to display 2 decimals')
cmd.do('solventRadius Set radius of ball used to make solvent accessible surface.')
cmd.do('spng Save png file with timestamp')
cmd.do('spse Save pse file with timestamp')
cmd.do('stack Base-stacking figure.')
cmd.do('stereoDraw Stereo draw.')
cmd.do('stereoRay Stereo ray.')
cmd.do('threeMaps Three electron density.')
cmd.do('turnAboutAxis Turn about axis.')
cmd.do('undoSelection Undo a selection.')
cmd.do('volumeRamp Volume ramp.')
cmd.do('writeCommandReference2HTML Write the command reference to html file in the present working directory."""')
"""
cmd.do('"""Tab trigger Description')
cmd.do('--------------------------- --------------------------------------------------------------------------------------------------------------------------------------------------------------------')
cmd.do('antialias Set antialias to get smooth edges')
cmd.do('ao Ambient occlusion.')
cmd.do('aveB4resiX AveBResiX, prints the residue number and the average bfactor.')
cmd.do(' Uses reduce and lambda, builtin Python functional programming functions.')
cmd.do(' Note that you need to convert the length of the list of Bfactors from an integer to a float before division into the sum.')
cmd.do('averageB Uses a regular list as opposed to PyMOL"s stored list. Edit the selection as needed.')
cmd.do('bs Ball and stick representation.')
cmd.do('bu Biological unit.')
cmd.do('carvedDensity Carved electron density.')
cmd.do('cblind Color blindness.')
cmd.do('centerpi Center pi.')
cmd.do('coordinate Coordinate covalent bonds to metals and H-bonds from RNA.')
cmd.do('cribbon Color ribbon H red, strand yellow, loop green.')
cmd.do('cspheres Colored spheres.')
cmd.do('discreteCartoonColoring Turn on discrete colors between secondary structure elements.')
cmd.do('distance H-bond distances.')
cmd.do('doubleBond Valence bond.')
cmd.do('drawHbonds Draw H-bonds.')
cmd.do('duplicateObject Duplicate object.')
cmd.do('ellipcol Set ellipsoid color.')
cmd.do('extractPartObj Create a new object from part of an existing object.')
cmd.do('fasta Print Fasta from PDB file.')
cmd.do('fetch2FoFc Fetch 2FoFc map.')
cmd.do('fetchCIF Fetch cif file.')
cmd.do('fetchFoFc Fetch fofc map.')
cmd.do('fetchPath Set path for location to save fetched pdb files.')
cmd.do('filledRing Filled rings in nucleic acids.')
cmd.do('findHbonds Find hbonds around a residue.')
cmd.do('fog Blur the background atoms.')
cmd.do('getCoordinates Get coordinates.')
cmd.do('hbond Hbond setup.')
cmd.do('hbonddash H-bond dashes.')
cmd.do('hideSelection Turn off magenta squares on current selection.')
cmd.do('hidealtloc Hide alt loc.')
cmd.do('internalGUImode2 Makes the background of the internal gui transparent with the viewport extended into this region of the gui. This may be a useful mode for workshops.')
cmd.do('internalGUIwidth Set the width of the internal gui.')
cmd.do('labelCAs Label the CA atoms with the Ala333 style format')
cmd.do('labelMainChain Label the main chain atoms by resn,resi,atom name.')
cmd.do('labelResnResi Label CA atom with residue name and residue number.')
cmd.do('labelSS Label SS.')
cmd.do('labelWatersHOH Label waters HOH.')
cmd.do('labelWatersW Label waters W.')
cmd.do('loadPDBbs Load PDB ball-and-stick.')
cmd.do('loadPDBfile Load a pdb file in the current directory.')
cmd.do('loadPDBnb Load PDB nb spheres.')
cmd.do('lsSnips List all snips by tab trigger and description')
cmd.do('lspymolrc Print list of active pymolrc files.')
cmd.do('molscriptRibbon Molscript ribbons.')
cmd.do('ms Measure surface area.')
cmd.do('oneLetter One letter amino acid.')
cmd.do('pearl The pearl effect is made with two spheres with the outer sphere being transparent.')
cmd.do('printBs Print the B-factors of a residue.')
cmd.do('printBs2digits Print B-values for a residue with the B"s rounded off to two decimal places.')
cmd.do('printBspartB Print B factors of part B of a residue.')
cmd.do('printDoc Print document string of a function.')
cmd.do('printNameB4ResiX Print name and b-factor for a residue. You can change this to a named selection or a residue range ( e.g., resi 133:155). Use the noH variant if H atoms are present.')
cmd.do('printResiResnNameB4ResiX Print resn, resi, atom name, and b-factor.')
cmd.do('printResiResnNameB4ResiXNoH Print name and b-factor for a residue. You can change this to a named selection or a residue range ( e.g., resi 133:155). The noH variant.')
cmd.do('pseudolabel Position label with pseudoatom.')
cmd.do('puttyCartoon Create a putty cartoon.')
cmd.do('ringMode Set the ring mode to a value between 0 and 6 in cartoons of nucleic acids.')
cmd.do('rmwater Remove waters from molecular object.')
cmd.do('rotate Rotate about axis.')
cmd.do('rv Return settings in rounded format.')
cmd.do('savePNG Save a png file of current scene to the current directory.')
cmd.do('saxsEnvelope Display SAXS envelope')
cmd.do('sc111 Display all symmetry mates in one unit cell. Assumes supercell.py (see PyMOL Wiki) is in $HOME/Scripts/PyMOLscripts/.')
cmd.do('sc222 Run Tom Holder"s supercell script to generate three cells in all directions.')
cmd.do('scaleRadiusColor Scale the radius and color of atoms as spheres by property in the B-value column.')
cmd.do('selectAllBut Select all nitrogen atoms in a selection except from lysine.')
cmd.do('selectAtomsAround Select atoms within a radius around a ligand.')
cmd.do('selectChain Select a chain.')
cmd.do('selectElement Select atoms by element.')
cmd.do('selectHelices Select atoms by alpha helices.')
cmd.do('selectLoops Select atoms by beta loops.')
cmd.do('selectName Select atoms by name.')
cmd.do('selectResi Select residues by a range of numbers.')
cmd.do('selectResidues Select residues by name.')
cmd.do('selectResiduesAround Select residues within a radius around a ligand.')
cmd.do('selectStrands Select atoms by beta strands.')
cmd.do('setcolor Set color name to a RGB code.')
cmd.do('setpath Set additional path for PyMOL to search on startup')
cmd.do('sidehChainHelper In cartoons, hide the backbone atoms of selected residues when showing then as sticks.')
cmd.do('sigDigits Set number of decimals places to show in distance labels.')
cmd.do('sigang Set angle labels to display 2 decimals places')
cmd.do('sigdist set distance labels to display 2 decimals')
cmd.do('solventRadius Set radius of ball used to make solvent accessible surface.')
cmd.do('spng Save png file with timestamp')
cmd.do('spse Save pse file with timestamp')
cmd.do('stack Base-stacking figure.')
cmd.do('stereoDraw Stereo draw.')
cmd.do('stereoRay Stereo ray.')
cmd.do('threeMaps Three electron density.')
cmd.do('turnAboutAxis Turn about axis.')
cmd.do('undoSelection Undo a selection.')
cmd.do('volumeRamp Volume ramp.')
cmd.do('writeCommandReference2HTML Write the command reference to html file in the present working directory."""')


# File: gym_rock_paper_scissors/envs/__init__.py (repo: kyuhyoung/gym-rock-paper-scissors, license: MIT)
from gym_rock_paper_scissors.envs.rock_paper_scissors_env import RockPaperScissorsEnv
from gym_rock_paper_scissors.envs.rock_paper_scissors_env import Action


# File: test/integration_tests/test_web_cli_e2e.py (repo: babatana/stograde, license: MIT)
import os
import sys
from unittest import mock

import pytest

from stograde.common import chdir
from stograde.toolkit.__main__ import main
from stograde.webapp import server

_dir = os.path.dirname(os.path.realpath(__file__))


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_student_menu(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1500', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = []
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass

    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'student',
        'message': 'Choose student',
        'choices': ['QUIT', 'student6', 'student7 NO SUBMISSION', 'student8'],
    }
    assert mock_prompt.call_count == 1

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_student_menu_blank(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1501', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [None]
                main()

    assert mock_prompt.call_count == 1

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_student_menu_quit(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1502', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'QUIT'}]
                main()

    assert mock_prompt.call_count == 1

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_student_menu_student_no_submission(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1503', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student7 NO SUBMISSION'}]
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass

    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'student',
        'message': 'Choose student',
        'choices': ['QUIT', 'student6', 'student7 NO SUBMISSION', 'student8'],
    }
    assert mock_prompt.call_count == 2

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_file_menu(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1504', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student6'}]
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass

    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'file',
        'message': 'Choose file',
        'choices': ['BACK', 'second.cpp', 'third.cpp'],
    }
    assert mock_prompt.call_count == 2

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\nProcessing...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_file_menu_missing_files(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1505', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student8'}]
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass

    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'file',
        'message': 'Choose file',
        'choices': ['BACK', 'second.cpp MISSING', 'third.cpp MISSING (OPTIONAL)'],
    }
    assert mock_prompt.call_count == 2

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\nProcessing...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_file_menu_back(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1506', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']

    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student6'},
                                           {'file': 'BACK'}]
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass

    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'student',
        'message': 'Choose student',
        'choices': ['QUIT', 'student6', 'student7 NO SUBMISSION', 'student8'],
    }
    assert mock_prompt.call_count == 3

    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\nProcessing...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_file_menu_blank(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1507', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']
    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student6'},
                                           None]
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass
    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'student',
        'message': 'Choose student',
        'choices': ['QUIT', 'student6', 'student7 NO SUBMISSION', 'student8'],
    }
    assert mock_prompt.call_count == 3
    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\nProcessing...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_file_menu_file(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1508', 'hw2',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']
    with chdir(str(datafiles)):
        with mock.patch('stograde.webapp.web_cli.prompt') as mock_prompt:
            with mock.patch('sys.argv', args):
                mock_prompt.side_effect = [{'student': 'student6'},
                                           {'file': 'second.cpp'}]
                assert server.work_dir == '.'
                try:
                    main()
                    raise AssertionError
                except StopIteration:
                    pass
    assert mock_prompt.call_args[0][0][0] == {
        'type': 'list',
        'name': 'file',
        'message': 'Choose file',
        'choices': ['BACK', 'second.cpp', 'third.cpp'],
    }
    assert mock_prompt.call_count == 3
    assert server.work_dir != '.'
    out, _ = capsys.readouterr()
    assert out == 'Loading repos. Please wait...\nProcessing...\n'


@pytest.mark.datafiles(os.path.join(_dir, 'fixtures', 'web_tests'))
def test_stograde_web_not_web_spec(datafiles, capsys):
    args = [sys.argv[0]] + ['web', '--port', '1509', 'hw1',
                            '--skip-repo-update', '--skip-spec-update',
                            '--skip-version-check', '--skip-dependency-check']
    with chdir(str(datafiles)):
        try:
            with mock.patch('sys.argv', args):
                main()
            raise AssertionError
        except SystemExit:
            pass
    out, _ = capsys.readouterr()
    assert out == 'No web files in assignment hw1\n'
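Each menu test above drives the interactive CLI by exhausting a mocked prompt's `side_effect` list: the call after the last prepared answer raises `StopIteration`, which unwinds `main()` and ends the loop. A minimal, self-contained sketch of that pattern (`run_menu` is a hypothetical stand-in for the real web CLI loop, not stograde code):

```python
from unittest import mock


def run_menu(prompt):
    # Hypothetical stand-in for the CLI loop: re-prompts forever.
    while True:
        prompt([{'type': 'list', 'name': 'student'}])


mock_prompt = mock.MagicMock(side_effect=[{'student': 'student6'}])
try:
    run_menu(mock_prompt)
    raise AssertionError  # unreachable: the exhausted side_effect breaks the loop
except StopIteration:
    pass

# The raising call is still recorded, so one answered call plus one raise = 2 calls,
# and call_args reflects the arguments of that final (raising) call.
assert mock_prompt.call_count == 2
assert mock_prompt.call_args[0][0][0] == {'type': 'list', 'name': 'student'}
```

This is why the tests assert one more call than the number of prepared answers: the exhausting call itself is counted.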
| 38.297101 | 94 | 0.509934 | 1,064 | 10,570 | 4.915414 | 0.100564 | 0.06501 | 0.047228 | 0.061185 | 0.910899 | 0.905545 | 0.900765 | 0.900765 | 0.861759 | 0.848757 | 0 | 0.015663 | 0.341627 | 10,570 | 275 | 95 | 38.436364 | 0.735882 | 0 | 0 | 0.771028 | 0 | 0 | 0.243141 | 0.058657 | 0 | 0 | 0 | 0 | 0.168224 | 1 | 0.046729 | false | 0.037383 | 0.03271 | 0 | 0.079439 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
2aeb3bf179dfb29a08f6c9294819616464c9f038 | 1,813 | py | Python | ai-lab/envs/maze_env.py | GiacomoFerro/ai-lab-2019 | 0483f2138d6ab9f0d62a983f676640e04b333469 | [
"MIT"
] | 6 | 2020-03-25T07:58:24.000Z | 2021-12-26T05:53:32.000Z | ai-lab/envs/maze_env.py | GiacomoFerro/ai-lab-2019 | 0483f2138d6ab9f0d62a983f676640e04b333469 | [
"MIT"
] | 2 | 2020-03-20T18:10:49.000Z | 2021-11-29T12:27:51.000Z | ai-lab/envs/maze_env.py | GiacomoFerro/ai-lab-2019 | 0483f2138d6ab9f0d62a983f676640e04b333469 | [
"MIT"
] | 3 | 2022-02-03T16:40:08.000Z | 2022-02-06T17:28:11.000Z | from envs.obsgrid_env import ObsGrid


class SmallMazeEnv(ObsGrid):
    """
    Small fully observable maze environment with deterministic actions
    """

    def __init__(self):
        actions = {0: "L", 1: "R", 2: "U", 3: "D"}
        grid = [
            ["C", "C", "S", "C"],
            ["C", "C", "W", "C"],
            ["C", "C", "C", "C"],
            ["C", "W", "W", "W"],
            ["C", "C", "C", "G"]
        ]
        rewards = {"C": 0, "S": 0, "G": 1}
        actdyn = {0: {0: 1.0}, 1: {1: 1.0}, 2: {2: 1.0}, 3: {3: 1.0}}
        super().__init__(actions, grid, actdyn, rewards)


class GrdMazeEnv(ObsGrid):
    """
    Small fully observable maze environment with deterministic actions where greedy search is optimal and expands
    fewer states than A*
    """

    def __init__(self):
        actions = {0: "L", 1: "R", 2: "U", 3: "D"}
        grid = [
            ["C", "C", "C", "S"],
            ["C", "C", "W", "C"],
            ["C", "C", "C", "C"],
            ["C", "W", "W", "W"],
            ["C", "C", "C", "G"]
        ]
        rewards = {"C": 0, "S": 0, "G": 1}
        actdyn = {0: {0: 1.0}, 1: {1: 1.0}, 2: {2: 1.0}, 3: {3: 1.0}}
        super().__init__(actions, grid, actdyn, rewards)


class BlockedMazeEnv(ObsGrid):
    """
    Small fully observable maze environment with deterministic actions where a solution does not exist
    """

    def __init__(self):
        actions = {0: "L", 1: "R", 2: "U", 3: "D"}
        grid = [
            ["C", "C", "S", "C"],
            ["C", "C", "W", "C"],
            ["C", "C", "C", "C"],
            ["C", "C", "W", "W"],
            ["C", "C", "W", "G"]
        ]
        rewards = {"C": 0, "S": 0, "G": 1}
        actdyn = {0: {0: 1.0}, 1: {1: 1.0}, 2: {2: 1.0}, 3: {3: 1.0}}
        super().__init__(actions, grid, actdyn, rewards)
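Because every action is deterministic (`actdyn` maps each action to itself with probability 1.0), the docstring claim that `BlockedMazeEnv` has no solution reduces to plain grid reachability from `"S"` to `"G"` with `"W"` cells as walls. A standalone check of that claim (independent of `ObsGrid`; the `reachable` helper is illustrative, not part of this package):

```python
from collections import deque


def reachable(grid):
    """BFS from the 'S' cell to the 'G' cell, treating 'W' cells as walls."""
    rows, cols = len(grid), len(grid[0])
    start = next((r, c) for r in range(rows) for c in range(cols) if grid[r][c] == "S")
    seen, frontier = {start}, deque([start])
    while frontier:
        r, c = frontier.popleft()
        if grid[r][c] == "G":
            return True
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "W" and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append((nr, nc))
    return False


small = [["C", "C", "S", "C"], ["C", "C", "W", "C"], ["C", "C", "C", "C"],
         ["C", "W", "W", "W"], ["C", "C", "C", "G"]]
blocked = [["C", "C", "S", "C"], ["C", "C", "W", "C"], ["C", "C", "C", "C"],
           ["C", "C", "W", "W"], ["C", "C", "W", "G"]]

assert reachable(small) is True      # SmallMazeEnv's goal is reachable
assert reachable(blocked) is False   # BlockedMazeEnv's goal is walled off
```

In `blocked`, the goal at the bottom-right corner is enclosed by wall cells on both adjacent sides, so no policy can reach it.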
| 31.807018 | 118 | 0.405957 | 245 | 1,813 | 2.902041 | 0.208163 | 0.084388 | 0.075949 | 0.056259 | 0.800281 | 0.797468 | 0.797468 | 0.797468 | 0.793249 | 0.700422 | 0 | 0.058081 | 0.344732 | 1,813 | 56 | 119 | 32.375 | 0.540404 | 0.162714 | 0 | 0.75 | 0 | 0 | 0.055177 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075 | false | 0 | 0.025 | 0 | 0.175 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
631c169c21eb0be5ac8947cfe49dd81343dcb21f | 27,200 | py | Python | DIKB/evidence-entry-audit-trail.py | dbmi-pitt/DIKB-Evidence-analytics | 9ffd629db30c41ced224ff2afdf132ce9276ae3f | [
"MIT"
] | 3 | 2015-06-08T17:58:54.000Z | 2022-03-10T18:49:44.000Z | DIKB/evidence-entry-audit-trail.py | dbmi-pitt/DIKB-Evidence-analytics | 9ffd629db30c41ced224ff2afdf132ce9276ae3f | [
"MIT"
] | null | null | null | DIKB/evidence-entry-audit-trail.py | dbmi-pitt/DIKB-Evidence-analytics | 9ffd629db30c41ced224ff2afdf132ce9276ae3f | [
"MIT"
] | null | null | null | # # ### an audit trail of evidence entries from command line
# # from DIKB_Utils import *
# # from DIKB import *
# # from DrugModel import *
# # from EvidenceModel import *
# # ev = EvidenceBase("evidence","123")
# # dikb = DIKB("dikb","123", ev)
# # dikb.unpickleKB("../var/DIKB/dikb.pickle")
# # ev.unpickleKB("../var/evidence-base/ev.pickle")
# # ev.renotifyObservers()
# # '''
# # a = ContValAssertion("erythromycin","increases_auc","midazolam")
# # e1 = PKStudy()
# # e1.create("10579141","midazolam: 15mg po; erythromycin 500 mg tidx7 d, po; change in AUC: 220", "rct1", "boycer", "01302006", 15.0, 500.0, 2.2)
# # a.insertEvidence("for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_midazolam"].ready_for_classification = True
# # dikb.pickleKB("../var/DIKB/dikb.pickle")
# # ev.pickleKB("../var/evidence-base/ev.pickle")
# # a = ContValAssertion("diltiazem","increases_auc","midazolam")
# # e1 = PKStudy()
# # e1.create("10579141","midazolam: 15mg po; diltiazem 60 mg tidx3 d, po; change in AUC: 380", "rct1", "boycer", "01302006", 15, 60.0, 3.8)
# # a.insertEvidence("for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_midazolam"].ready_for_classification = True
# # dikb.pickleKB("../var/DIKB/dikb.pickle")
# # ev.pickleKB("../var/evidence-base/ev.pickle")
# # '''
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8181191", "# volunteers: 9; midazolam: 7.5mg po; itraconazole 200mg tidx4 d, po; change in AUC: 10-15", "rct1","boycer", "01312006" , 7.5, 200, 15)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8527290", "#volunteers: 20; midazolam: 7.5mg po; itraconazole 100mg tidx d, po; change in AUC: 6-fold", "rct1","boycer", "01312006" , 7.5, 100, 6)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8623953", "#volunteers: 12; midazolam: 7.5mg po; itraconazole 200mg tidx 6d, po; change in AUC: 3.5-7 fold", "rct1","boycer", "01312006" ,7.5, 200, 7.0)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8181191", "#volunteers: 9; midazolam: 7.5mg po; ketoconazole 400mg tidx 4d, po; change in AUC: 10-15", "rct1","boycer", "01312006" , 7.5, 400, 15)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("9784084", "#volunteers: 10; alprazolam: .8mg po; itraconazole 200mg tidx 6d, po; change in AUC: 1.6", "rct1","boycer", "01312006", .8, 200, 1.6)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_alprazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("7995001", "#volunteers: 9; triazolam: .25mg po; itraconazole 200mg tidx 4d, po; change in AUC: 26", "rct1","boycer", "01312006", .25, 200, 26)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8841155", "#volunteers: 10; triazolam: .25mg po; itraconazole 200mg tidx 1d, po; change in AUC: 2.6", "rct1","boycer", "01312006", .25, 200, 2.6)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"simvastatin")
# # e1 = PKStudy()
# # e1.create("9542477", "#volunteers: 10; simvastatin: 40mg po; itraconazole 200mg tidx 4d, po; change in AUC: 10", "rct1","boycer", "01312006", 40, 200, 10)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_simvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"lovastatin")
# # e1 = PKStudy()
# # e1.create("8689812", "#volunteers: 12; lovastatin: 40mg po; itraconazole 200mg tidx 4d, po; change in AUC: 35", "rct1","boycer", "01312006", 40,200 , 35 )
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_lovastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"lovastatin")
# # e1 = PKStudy()
# # e1.create("9690949", "#volunteers: 10; lovastatin: 40mg po; itraconazole 100mg tidx 4d, po; change in AUC: 13.8", "rct1","boycer", "01312006",40 ,100 ,13.8)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_lovastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"fluvastatin")
# # e1 = PKStudy()
# # e1.create("9690949", "(as summarized in UW DIDB) #volunteers: 10; fluvastatin: 40mg po; itraconazole 100mg tidx 4d, po; change in AUC: 13.6", "rct1","boycer", "01312006", 40, 100, 13.6)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_fluvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"atorvastatin")
# # e1 = PKStudy()
# # e1.create("11061579", "#volunteers: 18; atorvastatin: 20mg po; itraconazole 200mg tidx 5d, po; change in AUC: 150%", "rct1","boycer", "01312006", 20, 200, 1.5)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_atorvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"pravastatin")
# # e1 = PKStudy()
# # e1.create("11061579", "#volunteers: 18; pravastatin: 40mg po; itraconazole 200mg tidx 5d, po; change in AUC: 51%", "rct1","boycer", "01312006", 40, 200, .5)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_pravastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"pravastatin")
# # e1 = PKStudy()
# # e1.create("9542477", "#volunteers: 10; pravastatin: 40mg po; itraconazole 200mg tidx 4d, po; change in AUC: 71.6%", "rct1","boycer", "01312006", 40, 200, .7)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_pravastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"rosuvastatin")
# # e1 = PKStudy()
# # e1.create("12709722", "#volunteers: 14; rosuvastatin: 80mg po; itraconazole 200mg tidx 5d, po; change in AUC: 26.4%", "rct1","boycer", "01312006", 80,200 , .26 )
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_rosuvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("itraconazole", "increases_auc" ,"rosuvastatin")
# # e1 = PKStudy()
# # e1.create("12709722", "#volunteers: 12; rosuvastatin: 10mg po; itraconazole 200mg tidx 5d, po; change in AUC: .37.3%", "rct1","boycer", "01312006", 10,200 , .37)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["itraconazole_increases_auc_rosuvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("8646822", "#volunteers: 12; alprazolam: 0.8mg po; erythromycin 400mg tidx d, po; change in AUC: 147.2% ", "rct1","boycer", "01312006", .8, 400, 1.47)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_alprazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("3771812", "#volunteers: 16; triazolam: 0.5mg po; erythromycin 333mg tidx 3d, po; change in AUC: 106% ", "rct1","boycer", "01312006", 0.5, 333, 1.06)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("9757151", "#volunteers: 12; triazolam: .125mg po; erythromycin 500mg tidx 2d, po; change in AUC: 280%", "rct1","boycer", "01312006", .125, 500, 2.8)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"simvastatin")
# # e1 = PKStudy()
# # e1.create("9728898", "(as summarized in UW DIDB) #volunteers: 12; simvastatin: 40mg po; erythromycin 500mg tidx 2d, po; change in AUC: 521.5%", "rct1","boycer", "01312006", 40, 500, 5.22)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_simvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"atorvastatin")
# # e1 = PKStudy()
# # e1.create("10234598", "(as summarized in UW DIDB) #volunteers: 11; atorvastatin: 10mg po; erythromycin 500mg tidx 11d, po; change in AUC: 32.5%", "rct1","boycer", "01312006", 10, 500, .33)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_atorvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("9757147", "(as summarized in UW DIDB) #volunteers: 7; alprazolam: 1mg po; ketoconazole 200mg tidx 4d, po; change in AUC: 298.3% ", "rct1","boycer", "01312006",1 ,200 , 2.98 )
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_alprazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("10634135", "(as summarized in UW DIDB) #volunteers: 4; alprazolam: 1mg po; ketoconazole 200mg tidx 2d, po; change in AUC: 76% ", "rct1","boycer", "01312006", 1, 200, .76)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_alprazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("9757147", "(as summarized in UW DIDB) #volunteers: 6; triazolam: .25mg po; ketoconazole 200mg tidx 4d, po; change in AUC: 1271.7%", "rct1","boycer", "01312006", .25 , 200 , 12.7 )
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8904618", "(as summarized in UW DIDB) #volunteers: 8; triazolam: .25mg po; fluconazole 100mg tidx 4d, po; change in AUC: 105.4%", "rct1","boycer", "01312006", .25,100 , 1.05)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8904618", "(as summarized in UW DIDB) #volunteers: 8; triazolam: .25mg po; fluconazole 200mg tidx 4d, po; change in AUC: 342.4%", "rct1","boycer", "01312006", .25, 200, 3.42)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8904618", "(as summarized in UW DIDB) #volunteers: 8; triazolam: .25mg po; fluconazole 50mg tidx 4d, po; change in AUC: 63%", "rct1","boycer", "01312006", .25, 50, .63)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8730978", "(as summarized in UW DIDB) #volunteers: 12; triazolam: .25mg po; fluconazole 100mg tidx 4d, po; change in AUC: 145.9%", "rct1","boycer", "01312006", .25, 100, 1.46)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_triazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"fluvastatin")
# # e1 = PKStudy()
# # e1.create("10952477", "(as summarized in UW DIDB) #volunteers: 12; fluvastatin: 40mg po; fluconazole 400 mg on day 1 and 200 mg on days 2-4 mg tidx 4d, po; change in AUC: 83.7% ", "rct1","boycer", "01312006", 40, 400, .84)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_fluvastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"pravastatin")
# # e1 = PKStudy()
# # e1.create("10952477", "(as summarized in UW DIDB) #volunteers: 12; pravastatin: 40mg po; fluconazole 400 mg on day 1 and 200 mg on days 2-4; change in AUC: 35.8%", "rct1","boycer", "01312006", 40, 200, .36)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_pravastatin"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("16172814", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 3mg po; fluconazole 100mg single dose, po; change in AUC: 116.4% ", "rct1","boycer", "01312006", 3 ,100 , 1.16 )
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("16172814", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 3mg po; fluconazole 200mg single dose, po; change in AUC: 231.9%", "rct1","boycer", "01312006", 3, 200, 2.32)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("16172814", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 3mg po; fluconazole 400mg single dose, po; change in AUC: 393% ", "rct1","boycer", "01312006", 3 ,400 , 3.93)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("fluconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8623953", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 7.5mg po; fluconazole 400 mg at D1 and then 200 mg (5 days); change in AUC: 259.8%", "rct1","boycer", "01312006",7.5 , 400, 2.6)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["fluconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("10579473", "(as summarized in UW DIDB) #volunteers: 9 ; midazolam: 6mg po; ketoconazole 200mg tidx 1.5d, po; change in AUC: 1261.6%", "rct1","boycer", "01312006", 6, 200, 12.62)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("ketoconazole", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("14551182", "(as summarized in UW DIDB) #volunteers: 10; midazolam: 10mg po; ketoconazole 200mg tidx 12d, po; change in AUC: 772+-596 ", "rct1","boycer", "01312006", 10, 200, 7.72)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["ketoconazole_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8720318", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 15mg po; erythromycin 500mg tidx 5d, po; change in AUC: 281.4%", "rct1","boycer", "01312006", 15, 500, 2.81)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("erythromycin", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8453848", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 15mg po; erythromycin 500mg tidx 7d, po; change in AUC: 341.7%", "rct1","boycer", "01312006", 15, 500, 3.42)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["erythromycin_increases_auc_midazolam"].ready_for_classification = True
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8880291", "(as summarized in UW DIDB) #volunteers: 12; midazolam: 15mg po; clarithromycin 250mg bid, po; change in AUC: 257.2%", "rct1","boycer", "01312006", 15, 250, 2.57)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_midazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"simvastatin")
# # e1 = PKStudy()
# # e1.create("15518608", "(as summarized in UW DIDB) #volunteers: 15; simvastatin: 40mg once daily on days 1-7 alone and with clarithromycin on days 10-17; clarithromycin 500mg BID on days 10-18; change in AUC: 895.5%", "rct1","boycer", "01312006", 40, 500, 8.96)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_simvastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"pravastatin")
# # e1 = PKStudy()
# # e1.create("15518608", "(as summarized in UW DIDB) #volunteers: 15; pravastatin: 40mg once daily on days 1-7 alone and with clarithromycin on days 10-17; clarithromycin 500mg BID on days 10-18; change in AUC: 111.1%", "rct1","boycer", "01312006", 40, 500, 1.11)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_pravastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"atorvastatin")
# # e1 = PKStudy()
# # e1.create("15518608", "(as summarized in UW DIDB) #volunteers: 15; atorvastatin: 80mg once daily on days 1-7 alone and with clarithromycin on days 10-17; clarithromycin 500mg BID on days 10-18; change in AUC: 345.1%", "rct1","boycer", "01312006", 80, 500, 3.45)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_atorvastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"atorvastatin")
# # e1 = PKStudy()
# # e1.create("11936570", "(as summarized in UW DIDB) #volunteers: 24; atorvastatin: 10mg po, once daily; clarithromycin 500mg tidx 3d, po, bid, from day 6 to day 8 of atorvastatin treatment; change in AUC: 81.9%", "rct1","boycer", "01312006", 10, 500, .82)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_atorvastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("clarithromycin", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("9757151", "(as summarized in UW DIDB) #volunteers: 12; triazolam: 0.125mg po single dose, 1h after the 3rd dose of clarithromycin or placebo; clarithromycin 500mg tidx 2d, po; change in AUC: 425.5%", "rct1","boycer", "01312006", 0.125, 500, 4.255)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["clarithromycin_increases_auc_triazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("nefazodone", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("14551182", "(as summarized in UW DIDB) #volunteers: 10; midazolam: 10mg single dose of midazolam oral solution (prepared as a 1:1 mixture of injectable midazolam and flavored, dye-free syrup), alone and 1 hour after the last dose of nefazodone; nefazodone 200 mg bid (100 mg bid for 5 days and 200 mg bid for 7 days); change in AUC: 444%", "rct1","boycer", "01312006", 10, 200, 4.44)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["nefazodone_increases_auc_midazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("nefazodone", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("8748428", "(as summarized in UW DIDB) #volunteers: 48; alprazolam: 1mg bid po 7days; nefazodone 200mg bidx 7d, po; change in AUC:98% ", "rct1","boycer", "01312006", 1, 200, .98)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["nefazodone_increases_auc_alprazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("nefazodone", "increases_auc" ,"alprazolam")
# # e1 = PKStudy()
# # e1.create("14709940", "(as summarized in UW DIDB) #volunteers: 16 (CYP2D6 EMs); alprazolam: 2mg po single dose before and on the last day of nefazodone therapy; nefazodone 400mg BID; 200 mg/day for 3 days then 400 mg/day for 5 days, po; change in AUC: 47.3%", "rct1","boycer", "01312006", 2, 400, .473)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["nefazodone_increases_auc_alprazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("nefazodone", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8830062", "(as summarized in UW DIDB) #volunteers: 12; triazolam: .25mg po; nefazodone 200mg bidx 7d, po; change in AUC: 289.9%", "rct1","boycer", "01312006", .25, 200, 2.899)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["nefazodone_increases_auc_triazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"midazolam")
# # e1 = PKStudy()
# # e1.create("8198928", "(as summarized in UW DIDB) #volunteers: 9; midazolam: 15mg po; diltiazem 60mg tidx 2d, po; change in AUC: 275%", "rct1","boycer", "01312006", 15, 60, 2.75)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_midazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("8612379", "(as summarized in UW DIDB) #volunteers: 10; triazolam: .25mg po; diltiazem 60mg tidx 2d, po; change in AUC: 238.1", "rct1","boycer", "01312006", .25, 60, 2.381)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_triazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"triazolam")
# # e1 = PKStudy()
# # e1.create("9146848", "(as summarized in UW DIDB) #volunteers: 7; triazolam: 0.25mg po; diltiazem 60mg tid for 3 days and 1 hour before triazolam intake, po; change in AUC: 127.5%", "rct1","boycer", "01312006", .25, 60, 1.275)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_triazolam"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"simvastatin")
# # e1 = PKStudy()
# # e1.create("10741630", "(as summarized in UW DIDB) #volunteers: 10; simvastatin: 20mg po; diltiazem 120mg bid 2wks, po; change in AUC: 381.4% ", "rct1","boycer", "01312006", 20, 120, 3.814)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_simvastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"lovastatin")
# # e1 = PKStudy()
# # e1.create("9797793", "(as summarized in UW DIDB) #volunteers: 10; lovastatin: 20mg po; diltiazem 120mg bid x 2wks, po; change in AUC: 257.2%", "rct1","boycer", "01312006", 20, 120, 2.572)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_lovastatin"].ready_for_classification = False
# # #
# # a = ContValAssertion("diltiazem", "increases_auc" ,"pravastatin")
# # e1 = PKStudy()
# # e1.create("9797793", "(as summarized in UW DIDB) #volunteers: 10; pravastatin: 20mg po; diltiazem 120mg bid x 2wks, po; change in AUC: 2.7%", "rct1","boycer", "01312006", 20, 120, .027)
# # a.insertEvidence( "for", e1)
# # ev.addAssertion(a)
# # ev.objects["diltiazem_increases_auc_pravastatin"].ready_for_classification = False
# # #
# # dikb.pickleKB("../var/DIKB/dikb.pickle")
# # ev.pickleKB("../var/evidence-base/ev.pickle")
# ########### fraction cleared by
# a = Assertion_m_discrete('alprazolam', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","alprazolam - f_mCYP: 0.80", 'est1', 'boycer', '02062006', 0.8)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# #
# a = Assertion_m_discrete('midazolam', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","midazolam - f_mCYP: 0.99", 'est3', 'boycer', '02062006', 0.99)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# a = Assertion_m_discrete('midazolam', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","midazolam - f_mCYP: 0.94", 'est1', 'boycer', '02062006', 0.99)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# ##
# a = Assertion_m_discrete('simvastatin', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","simvastatin - f_mCYP: 0.99", 'est2', 'boycer', '02062006', 0.99)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# #
# a = Assertion_m_discrete('triazolam', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","triazolam - f_mCYP: 0.98", 'est3', 'boycer', '02062006', 0.98)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# a = Assertion_m_discrete('triazolam', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","triazolam - f_mCYP: 0.92", 'est1', 'boycer', '02062006', 0.92)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# ##
# a = Assertion_m_discrete('lovastatin', 'fraction_cleared_by', 'cyp3a4')
# e1 = EvidenceContinousVal()
# e1.create("16236041","lovastatin - f_mCYP: 0.99", 'est2', 'boycer', '02062006', 0.99)
# a.insertEvidence( "for", e1)
# ev.addAssertion(a)
# #
# a1 = Assertion_continuous_s_val('phenytoin','bioavailability','continuous_value')
# e1 = EvidenceContinousVal()
# e1.create("Goodman and Gillman 10th Ed", "90.0%", 'ast', 'boycer', '02142006', 90.0)
# a1.evidence_for.append(e1)
# ev.addAssertion(a1)
# ev.objects['phenytoin_bioavailability_continuous_value'].ready_for_classification = True
# #
# dikb.pickleKB("../var/DIKB/dikb.pickle")
# ev.pickleKB("../var/evidence-base/ev.pickle")
# File: src/feat/vocabulary.py (repo: linminhtoo/megan, license: MIT)
"""
Vocabulary of all possible reaction generation actions
"""
# Found on Maxus092 and USPTO-50k train+valid
DEFAULT_ACTION_VOCABULARY = [
('change_atom', (1, 0, 0, 0)),
('change_atom', (0, 0, 1, 1)),
('change_atom', (-1, 0, 0, 0)),
('change_atom', (0, 0, 0, 0)),
('change_atom', (0, 1, 1, 0)),
('change_atom', (0, 1, 0, 0)),
('change_atom', (1, 0, 0, 1)),
('change_atom', (0, 0, 1, 0)),
('change_atom', (-1, 0, 0, 1)),
('change_atom', (0, 0, 0, 1)),
('change_atom', (0, 2, 0, 0)),
('change_atom', (1, 0, 1, 0)),
('change_atom', (0, 2, 1, 0)),
('change_atom', (3, 0, 0, 0)),
('change_atom', (1, 0, 1, 1)),
('change_bond', (2, 2)),
('change_bond', (2, 3)),
('change_bond', (2, 0)),
('change_bond', (1, 0)),
('change_bond', (12, 0)),
('change_bond', (3, 0)),
('change_bond', (None, None)),
('add_atom', ((1, 0), (14, 0, 0, 0, 0))),
('add_atom', ((2, 0), (7, -1, 0, 0, 0))),
('add_atom', ((2, 0), (16, 0, 0, 1, 0))),
('add_atom', ((2, 0), (16, 0, 1, 0, 0))),
('add_atom', ((1, 0), (7, 0, 0, 0, 1))),
('add_atom', ((1, 0), (12, 1, 0, 0, 0))),
('add_atom', ((1, 0), (9, 0, 0, 0, 0))),
('add_atom', ((3, 0), (7, 0, 0, 0, 0))),
('add_atom', ((1, 0), (15, 0, 0, 1, 0))),
('add_atom', ((12, 0), (8, 0, 0, 0, 1))),
('add_atom', ((1, 0), (83, 0, 0, 0, 0))),
('add_atom', ((1, 0), (34, 0, 0, 0, 0))),
('add_atom', ((2, 0), (15, 1, 0, 0, 0))),
('add_atom', ((2, 0), (50, 0, 0, 0, 0))),
('add_atom', ((2, 2), (7, 0, 0, 0, 0))),
('add_atom', ((1, 0), (16, -1, 0, 0, 0))),
('add_atom', ((1, 0), (35, 0, 0, 0, 0))),
('add_atom', ((1, 0), (8, 1, 0, 0, 0))),
('add_atom', ((1, 0), (13, 1, 0, 0, 0))),
('add_atom', ((1, 0), (15, 0, 0, 0, 0))),
('add_atom', ((1, 0), (15, 0, 0, 0, 1))),
('add_atom', ((2, 0), (6, 0, 0, 0, 0))),
('add_atom', ((3, 0), (6, 0, 0, 0, 0))),
('add_atom', ((2, 0), (5, 0, 0, 0, 0))),
('add_atom', ((1, 0), (82, 3, 0, 0, 0))),
('add_atom', ((1, 0), (30, 1, 0, 0, 0))),
('add_atom', ((1, 0), (13, -1, 0, 0, 0))),
('add_atom', ((1, 0), (8, -1, 0, 0, 0))),
('add_atom', ((3, 0), (6, -1, 0, 0, 0))),
('add_atom', ((2, 0), (6, 0, 0, 0, 1))),
('add_atom', ((1, 0), (6, 0, 2, 0, 0))),
('add_atom', ((1, 0), (5, 2, 0, 0, 0))),
('add_atom', ((1, 0), (11, 0, 0, 0, 0))),
('add_atom', ((1, 0), (16, 0, 0, 0, 0))),
('add_atom', ((12, 0), (16, 0, 0, 0, 1))),
('add_atom', ((2, 0), (8, 0, 0, 0, 0))),
('add_atom', ((1, 0), (7, -1, 0, 0, 0))),
('add_atom', ((1, 0), (7, 1, 0, 0, 1))),
('add_atom', ((1, 0), (16, 0, 0, 1, 0))),
('add_atom', ((1, 0), (16, 0, 1, 0, 0))),
('add_atom', ((1, 0), (33, 1, 0, 0, 0))),
('add_atom', ((12, 0), (8, 1, 0, 0, 1))),
('add_atom', ((2, 0), (14, 0, 0, 0, 0))),
('add_atom', ((1, 0), (17, 0, 0, 0, 0))),
('add_atom', ((1, 0), (6, 0, 2, 1, 0))),
('add_atom', ((1, 0), (53, 1, 0, 1, 0))),
('add_atom', ((1, 0), (6, 0, 0, 0, 1))),
('add_atom', ((2, 0), (7, 0, 0, 0, 0))),
('add_atom', ((1, 0), (53, 0, 0, 0, 0))),
('add_atom', ((1, 0), (16, 0, 2, 0, 0))),
('add_atom', ((2, 0), (16, 0, 0, 0, 0))),
('add_atom', ((2, 2), (6, 0, 0, 0, 0))),
('add_atom', ((1, 0), (6, -1, 0, 0, 0))),
('add_atom', ((1, 0), (30, 0, 0, 0, 0))),
('add_atom', ((1, 0), (19, 0, 0, 0, 0))),
('add_atom', ((1, 0), (16, 1, 0, 0, 0))),
('add_atom', ((2, 3), (7, 0, 0, 0, 0))),
('add_atom', ((1, 0), (7, 1, 0, 0, 0))),
('add_atom', ((2, 3), (6, 0, 0, 0, 0))),
('add_atom', ((12, 0), (7, 1, 0, 0, 1))),
('add_atom', ((1, 0), (30, 2, 0, 0, 0))),
('add_atom', ((1, 0), (5, 0, 0, 0, 0))),
('add_atom', ((1, 0), (50, 0, 0, 0, 0))),
('add_atom', ((12, 0), (7, 1, 0, 1, 1))),
('add_atom', ((12, 0), (7, 0, 0, 1, 1))),
('add_atom', ((1, 0), (5, -1, 0, 0, 0))),
('add_atom', ((2, 0), (15, 0, 0, 0, 0))),
('add_atom', ((12, 0), (16, 1, 0, 0, 1))),
('add_atom', ((12, 0), (7, 0, 0, 0, 1))),
('add_atom', ((1, 0), (7, 0, 0, 0, 0))),
('add_atom', ((1, 0), (14, 0, 2, 0, 0))),
('add_atom', ((1, 0), (6, 0, 0, 0, 0))),
('add_atom', ((12, 0), (6, 0, 0, 0, 1))),
('add_atom', ((1, 0), (6, 0, 1, 1, 0))),
('add_atom', ((1, 0), (15, 1, 0, 0, 0))),
('add_atom', ((1, 0), (1, 0, 0, 0, 0))),
('add_atom', ((2, 0), (7, 1, 0, 0, 0))),
('add_atom', ((1, 0), (13, 0, 0, 0, 0))),
('add_atom', ((1, 0), (53, 1, 0, 0, 0))),
('add_atom', ((1, 0), (12, 0, 0, 0, 0))),
('add_atom', ((1, 0), (3, 0, 0, 0, 0))),
('add_atom', ((1, 0), (8, 0, 0, 0, 0))),
('add_atom', ((1, 0), (6, 0, 1, 0, 0))),
('add_atom', ((1, 0), (29, 0, 0, 0, 0))),
('add_ring', 'benzene'),
('stop', )
]
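A model emitting one of these actions per generation step typically needs a bidirectional mapping between actions and integer class ids. The helper below is a minimal sketch of that pattern — it is not part of this module, and it uses a tiny stand-in vocabulary so the snippet is self-contained.

```python
# Hypothetical helper (not in the original module): index actions for a
# model's output layer. SAMPLE_VOCABULARY stands in for the full
# DEFAULT_ACTION_VOCABULARY above.
SAMPLE_VOCABULARY = [
    ('change_bond', (2, 0)),
    ('add_ring', 'benzene'),
    ('stop',),
]

# Forward map: action tuple -> class id.
ACTION_TO_INDEX = {action: i for i, action in enumerate(SAMPLE_VOCABULARY)}
# Inverse map: class id -> action tuple, for decoding model predictions.
INDEX_TO_ACTION = {i: action for action, i in ACTION_TO_INDEX.items()}
```

Because the vocabulary entries are (nested) tuples, they are hashable and can be used directly as dictionary keys.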
# File: test/line_test.py (repo: TimeExceed/draw.py, license: BSD-3-Clause)
import testa
from fathom import Point, ORIGIN
import fathom.tikz as tikz
import fathom.colors as colors
import fathom.line_styles as line_styles
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw (0.00cm,0.00cm) -- (1.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def draw_line():
canvas = tikz.Canvas()
canvas.new_line(src=ORIGIN, dst=Point(1, 0))
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw[->] (0.00cm,0.00cm) -- (1.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def draw_arrow():
canvas = tikz.Canvas()
canvas.new_arrow(src=ORIGIN, dst=Point(1, 0))
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw[<->] (0.00cm,0.00cm) -- (1.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def draw_dblarrow():
canvas = tikz.Canvas()
canvas.new_dblarrow(src=ORIGIN, dst=Point(1, 0))
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw[<-] (0.00cm,0.00cm) -- (1.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def draw_backward_arrow():
canvas = tikz.Canvas()
canvas.new_backward_arrow(src=ORIGIN, dst=Point(1, 0))
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw (1.00cm,0.00cm) -- (3.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def line_from_shape():
canvas = tikz.Canvas()
c0 = canvas.new_circle(center=ORIGIN, radius=1, pen_color=colors.INVISIBLE)
canvas.new_line(src=c0, dst=Point(3, 0))
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw (0.00cm,0.00cm) -- (2.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def line_to_shape():
canvas = tikz.Canvas()
c1 = canvas.new_circle(
center=Point(3, 0),
radius=1,
pen_color=colors.INVISIBLE)
canvas.new_line(src=ORIGIN, dst=c1)
return canvas.draw()
@testa.is_(expect=r'''
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
\draw (1.00cm,0.00cm) -- (2.00cm,0.00cm);
\end{tikzpicture}
\end{document}
''')
def line_between_shapes():
canvas = tikz.Canvas()
c0 = canvas.new_circle(
center=ORIGIN,
radius=1,
pen_color=colors.INVISIBLE)
c1 = canvas.new_circle(
center=Point(3, 0),
radius=1,
pen_color=colors.INVISIBLE)
canvas.new_line(src=c0, dst=c1)
return canvas.draw()
if __name__ == '__main__':
testa.main()
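Every test above repeats the same LaTeX boilerplate and varies only the `\draw` line. A small sketch of how that duplication could be factored out (an assumed helper, not part of the test suite) is:

```python
# Hypothetical helper mirroring the boilerplate repeated in each expected
# string above: only the tikz body differs between tests.
TEMPLATE = r"""
\documentclass[UTF8]{ctexart}
\usepackage[a0paper]{geometry}
\usepackage{tikz}
\usetikzlibrary{arrows.meta,arrows}
\pagestyle{empty}
\begin{document}
\begin{tikzpicture}[>=Stealth]
%s
\end{tikzpicture}
\end{document}
"""


def expected_doc(tikz_body):
    """Wrap a tikz draw command in the full expected LaTeX document."""
    return TEMPLATE % tikz_body
```

With this, each `@testa.is_(expect=...)` decorator could take `expected_doc(r'\draw (0.00cm,0.00cm) -- (1.00cm,0.00cm);')` instead of an inline document.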
# File: tests.py (repo: nikita7957/curs, license: MIT)
import pytest
from Search import dfs
@pytest.mark.parametrize('graph, node, expected',
[
({'A': (['B', 'C']),
'B': (['A', 'D', 'E']),
'C': (['A', 'F']),
'D': (['B']),
'E': (['B', 'F']),
'F': (['C', 'E'])},'B',"['B', 'A', 'C', 'F', 'E', 'D']")
])
def test_node_B(graph, node, expected):
    assert str(dfs(graph, node)) == expected
@pytest.mark.parametrize('graph, node, expected',
[
({'A': (['B', 'C']),
'B': (['A', 'D', 'E']),
'C': (['A', 'F']),
'D': (['B']),
'E': (['B', 'F']),
'F': (['C', 'E'])},'A',"['A', 'B', 'D', 'E', 'F', 'C']")
])
def test_node_A(graph, node, expected):
    assert str(dfs(graph, node)) == expected
@pytest.mark.parametrize('graph, node, expected',
[
({'A': (['B', 'C']),
'B': (['A', 'D', 'E']),
'C': (['A', 'F']),
'D': (['B']),
'E': (['B', 'F']),
'F': (['C', 'E'])},'Q','error')
])
def test_NonExistent_Node(graph, node, expected):
    assert str(dfs(graph, node)) == expected
@pytest.mark.parametrize('graph, node, expected',
[
({'A': (['B', 'C']),
'B': (['A', 'D', 'E']),
'C': (['A', 'F']),
'D': (['B'])},'F','error')
])
def test_Unconnected_Node(graph, node, expected):
    assert str(dfs(graph, node)) == expected
@pytest.mark.parametrize('graph, node, expected',
[
({'A': (['A'])},'A','error')
])
def test_Endless_Graph(graph, node, expected):
    assert str(dfs(graph, node)) == expected
def test_docs():
    assert open('DOCS.md') is not None

def test_license():
    assert open('LICENSE') is not None
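The suite above pins down the traversal order of `Search.dfs` (recursive depth-first search visiting neighbors in list order) and its error cases. For reference, here is one implementation consistent with these tests — an assumption only, since the real `Search.dfs` may detect the error cases differently:

```python
def dfs(graph, node):
    """Recursive DFS over an adjacency dict, visiting neighbors in list order.

    Returns the visit order as a list, or the string 'error' when the start
    node is missing, a neighbor has no adjacency entry, or a node lists
    itself as a neighbor (the 'endless graph' case tested above).
    """
    if node not in graph:
        return 'error'
    if any(v in neighbors for v, neighbors in graph.items()):
        return 'error'  # self-loop guard
    visited = []

    def visit(v):
        if v not in graph:
            raise KeyError(v)  # neighbor without its own adjacency entry
        visited.append(v)
        for w in graph[v]:
            if w not in visited:
                visit(w)

    try:
        visit(node)
    except KeyError:
        return 'error'
    return visited
```

Visiting neighbors in the order they are listed reproduces the exact orders the tests expect, e.g. `['B', 'A', 'C', 'F', 'E', 'D']` when starting from `'B'`.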
# File: src/C-Python/Polyquine.py (repo: graph-paper/Polyquine, license: WTFPL)
#if false
a="""
#endif
int main(){char*b="#if false%ca=%c%c%c%c#endif%cint main(){char*b=%c%s%c;printf(b,10,34,34,34,10,10,34,b,34,10,10,34,34,34,10,10,10,34,34,10,10);}%c#if false%c%c%c%c%cq=chr(34)%cn=chr(10)%cs=%c#if false%%ca=%%c%%c%%c%%s%%c%%c%%c%%cq=chr(34)%%cn=chr(10)%%cs=%%c%%s%%c%%cprint(s %%%% (n,q,q,q,a,q,q,q,n,n,n,q,s,q,n,n))%%c#endif%c%cprint(s %% (n,q,q,q,a,q,q,q,n,n,n,q,s,q,n,n))%c#endif";printf(b,10,34,34,34,10,10,34,b,34,10,10,34,34,34,10,10,10,34,34,10,10);}
#if false
"""
q=chr(34)
n=chr(10)
s="#if false%ca=%c%c%c%s%c%c%c%cq=chr(34)%cn=chr(10)%cs=%c%s%c%cprint(s %% (n,q,q,q,a,q,q,q,n,n,n,q,s,q,n,n))%c#endif"
print(s % (n,q,q,q,a,q,q,q,n,n,n,q,s,q,n,n))
#endif
# File: Dataset.py (repo: wangzhen21/neural_collaborative_filtering, license: Apache-2.0)
'''
Created on Aug 8, 2016
Processing datasets.
@author: Xiangnan He (xiangnanhe@gmail.com)
'''
import scipy.sparse as sp
import numpy as np
from tqdm import tqdm
class Dataset(object):
'''
classdocs
'''
def __init__(self, path):
'''
Constructor
'''
self.trainMatrix = self.load_rating_file_as_matrix(path + ".train.rating")
self.testRatings = self.load_rating_file_as_list(path + ".test.rating")
self.testNegatives = self.load_negative_file(path + ".test.negative")
assert len(self.testRatings) == len(self.testNegatives)
self.num_users, self.num_items = self.trainMatrix.shape
def load_rating_file_as_list(self, filename):
ratingList = []
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
user, item = int(arr[0]), int(arr[1])
ratingList.append([user, item])
line = f.readline()
return ratingList
def load_negative_file(self, filename):
negativeList = []
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
negatives = []
for x in arr[1: ]:
negatives.append(int(x))
negativeList.append(negatives)
line = f.readline()
return negativeList
def load_rating_file_as_matrix(self, filename):
'''
Read .rating file and Return dok matrix.
The first line of .rating file is: num_users\t num_items
'''
# Get number of users and items
num_users, num_items = 0, 0
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
u, i = int(arr[0]), int(arr[1])
num_users = max(num_users, u)
num_items = max(num_items, i)
line = f.readline()
# Construct matrix
mat = sp.dok_matrix((num_users+1, num_items+1), dtype=np.float32)
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
user, item, rating = int(arr[0]), int(arr[1]), float(arr[2])
if (rating > 0):
mat[user, item] = 1.0
line = f.readline()
return mat
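The loader above expects `.rating` files of tab-separated `user	item	rating` triples and keeps only positive interactions. As a dependency-free illustration of that parsing step (a sketch only — the real loader returns a `scipy.sparse.dok_matrix`, not a dict):

```python
def parse_ratings(lines):
    """Parse tab-separated 'user item rating' lines into a dict mapping
    (user, item) -> 1.0 for positive interactions, mirroring the logic of
    load_rating_file_as_matrix without the scipy dependency."""
    mat = {}
    for line in lines:
        arr = line.rstrip("\n").split("\t")
        user, item, rating = int(arr[0]), int(arr[1]), float(arr[2])
        if rating > 0:
            # Implicit-feedback binarization: any positive rating becomes 1.0.
            mat[(user, item)] = 1.0
    return mat
```

Note that ratings of zero are parsed but discarded, matching the `if (rating > 0)` check in the class above.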
class Datasetstance(object):
'''
classdocs
'''
def __init__(self, path):
'''
Constructor
'''
self.trainMatrix,self.trainstanceMatrix,self.stance_num = self.load_rating_file_as_matrix(path + ".train.rating.stance")
self.stance_num += 1
self.testRatings = self.load_rating_file_as_list(path + ".test.rating.stance")
self.testNegatives,self.testNegativesDre = self.load_negative_file(path + ".test.negtive.stance.newindex",path + ".test.negtive.flag.stance.newindex")
assert len(self.testRatings) == len(self.testNegatives)
self.num_users, self.num_items = self.trainMatrix.shape
def load_rating_file_as_list(self, filename):
ratingList = []
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
user, item,stance = int(arr[0]), int(arr[1]),int(arr[5])
if stance > self.stance_num:
stance = self.stance_num
ratingList.append([user, item,stance])
line = f.readline()
return ratingList
def load_negative_file(self, filename,filename2):
negativeList = []
negativedireList = []
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
negatives = []
for x in arr[1:]:
negatives.append(int(x))
negativeList.append(negatives)
line = f.readline()
with open(filename2, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
negatives = []
for x in arr[1:]:
negatives.append(int(x))
negativedireList.append(negatives)
line = f.readline()
return negativeList,negativedireList
def load_rating_file_as_matrix(self, filename):
'''
Read .rating file and Return dok matrix.
The first line of .rating file is: num_users\t num_items
'''
# Get number of users and items
num_users, num_items,num_stance = 0, 0,0
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
arr = line.split("\t")
u, i,s = int(arr[0]), int(arr[1]),int(arr[5])
num_users = max(num_users, u)
num_items = max(num_items, i)
num_stance = max(num_stance,s)
line = f.readline()
# Construct matrix
mat = sp.dok_matrix((num_users + 1, num_items + 1), dtype=np.float32)
matstance = sp.dok_matrix((num_users + 1, num_items + 1), dtype=np.float32)
iline = 0
with open(filename, "r") as f:
line = f.readline()
while line != None and line != "":
                iline += 1
                if iline % 10000 == 0:
                    print(iline)
arr = line.split("\t")
user, item, rating,stance = int(arr[0]), int(arr[1]), float(arr[2]),int(arr[5])
if (rating > 0):
mat[user, item] = 1.0
matstance[user, item] = stance
line = f.readline()
        return mat, matstance, num_stance
# File: src/dicom_parser/utils/__init__.py (repo: GalBenZvi/dicom_parser, license: MIT)
"""
Utilities for the *dicom_parser* package.
"""
from dicom_parser.utils.parse_tag import parse_tag
from dicom_parser.utils.read_file import read_file
from dicom_parser.utils.requires_pandas import requires_pandas
# File: conformal_blocks/test/test_simple_lie_algebra.py (repo: mjschust/conformal-blocks, license: MIT)
'''
Created on Nov 19, 2016
@author: mjschust
'''
from __future__ import division
import unittest
import conformal_blocks.cbbundle as cbd
class Test(unittest.TestCase):
def test_sl2_char(self):
liealg = cbd.TypeALieAlgebra(1)
wt = tuple([0])
self.assertEqual(1, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([1])
self.assertEqual(2, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([2])
self.assertEqual(3, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([2])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
char_wt = tuple([0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
def test_sl3_char(self):
liealg = cbd.TypeALieAlgebra(2)
wt = (0, 0)
self.assertEqual(1, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([0, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = (1, 0)
self.assertEqual(3, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = (0, 1)
self.assertEqual(3, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 0])
self.assertFalse(char_wt in dom_char, "Character incorrect")
char_wt = tuple([0, 1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = (1, 1)
self.assertEqual(8, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
char_wt = tuple([0, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(2, dom_char[char_wt], "Character incorrect")
wt = (2, 1)
self.assertEqual(15, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([2, 1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
char_wt = tuple([1, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(2, dom_char[char_wt], "Character incorrect")
char_wt = tuple([0, 2])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
def test_sl4_char(self):
liealg = cbd.TypeALieAlgebra(3)
wt = tuple([0, 0, 0])
self.assertEqual(1, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([0, 0, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([1, 0, 0])
self.assertEqual(4, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 0, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([0, 0, 1])
self.assertEqual(4, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 0, 0])
self.assertFalse(char_wt in dom_char, "Character incorrect")
char_wt = tuple([0, 0, 1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([0, 1, 0])
self.assertEqual(6, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([0, 1, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
wt = tuple([1, 1, 1])
self.assertEqual(64, liealg.get_rep_dim(wt), "Dimension not correct")
dom_char = liealg.get_dominant_character(wt)
char_wt = tuple([1, 1, 1])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(1, dom_char[char_wt], "Character incorrect")
char_wt = tuple([2, 0, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(2, dom_char[char_wt], "Character incorrect")
char_wt = tuple([0, 1, 0])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(4, dom_char[char_wt], "Character incorrect")
char_wt = tuple([0, 0, 2])
self.assertTrue(char_wt in dom_char, "Character incorrect")
self.assertEqual(2, dom_char[char_wt], "Character incorrect")
def test_sl2_tensor(self):
liealg = cbd.TypeALieAlgebra(1)
decomp = liealg.tensor((0,), (0,))
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.tensor((0,), (1,))
dec_wt = tuple([1])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.tensor((1,), (1,))
dec_wt = tuple([2])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.tensor((2,), (1,))
dec_wt = tuple([3])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([1])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.tensor((5,), (2,))
dec_wt = tuple([7])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([5])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([3])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
def test_sl2_fusion(self):
liealg = cbd.TypeALieAlgebra(1)
decomp = liealg.fusion((0,), (0,),1)
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((0,), (1,), 1)
dec_wt = tuple([1])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((1,), (1,),1)
dec_wt = tuple([2])
self.assertFalse(dec_wt in decomp, "Tensor decomp incorrect")
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((1,), (1,),2)
dec_wt = tuple([2])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((5,), (2,),7)
dec_wt = tuple([7])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([5])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([3])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((5,), (2,),6)
dec_wt = tuple([7])
self.assertFalse(dec_wt in decomp, "Tensor decomp incorrect")
dec_wt = tuple([5])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([3])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.fusion((5,), (2,),5)
dec_wt = tuple([7])
self.assertFalse(dec_wt in decomp, "Tensor decomp incorrect")
dec_wt = tuple([5])
self.assertFalse(dec_wt in decomp and decomp[dec_wt] > 0, "Tensor decomp incorrect")
dec_wt = tuple([3])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
def test_sl2_multi_fusion(self):
liealg = cbd.TypeALieAlgebra(1)
wt1 = tuple([0])
wt2 = tuple([1])
decomp = liealg.multi_fusion([wt1, wt2, wt2], 1)
dec_wt = tuple([2])
self.assertFalse(dec_wt in decomp, "Tensor decomp incorrect")
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
decomp = liealg.multi_fusion([wt2, wt2, wt2], 1)
dec_wt = tuple([1])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
wt1 = tuple([4])
wt2 = tuple([2])
decomp = liealg.multi_fusion([wt2, wt2, wt2], 4)
dec_wt = tuple([4])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([2])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(3, decomp[dec_wt], "Tensor decomp incorrect")
dec_wt = tuple([0])
self.assertTrue(dec_wt in decomp, "Tensor decomp incorrect")
self.assertEqual(1, decomp[dec_wt], "Tensor decomp incorrect")
def test_degree(self):
liealg = cbd.TypeALieAlgebra(1)
wt1 = tuple([1])
wt2 = tuple([3])
wt3 = tuple([5])
#self.assertEqual(1, liealg.degree(wt1,wt2,wt2,wt3, 5), "Degree incorrect")
#self.assertEqual(0, liealg.degree(wt1, wt2, wt2, wt3, 6), "Degree incorrect")
liealg = cbd.TypeALieAlgebra(4)
wt1 = tuple([0, 1, 0, 0])
wt2 = tuple([0, 0, 1, 0])
#self.assertEqual(2, liealg.degree(wt1, wt1, wt2, wt2, 1), "Degree incorrect")
if __name__ == "__main__":
#import sys;sys.argv = ['', 'Test.testSL2OrbitIter','Test.testSL2Tensor','Test.testSL2Fusion','testSL2MultiFusion','testDegree']
    unittest.main()

# File: autokeras/backend/tensorflow/__init__.py (repo: chosungsu/autokeras, license: MIT)
from autokeras.backend.tensorflow.model import produce_model
from autokeras.backend.tensorflow.data_transformer import ImageDataTransformer
from autokeras.backend.tensorflow.model_trainer import ModelTrainer
from autokeras.backend.tensorflow.loss_function import *
from autokeras.backend.tensorflow.metric import *

# File: svsim/compare/sc21_compare/cirq/cirq_qft_n15.py (repo: yukwangmin/SV-Sim, license: MIT)
import time
import cirq
import numpy as np
from functools import reduce
def u1(p_lambda):
return cirq.MatrixGate(np.array([[1, 0], [0, np.exp(1j*p_lambda)]]))
q = [cirq.NamedQubit('q' + str(i)) for i in range(15)]
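# Throughout the generated circuit below, every controlled-phase rotation of
# the QFT is expanded into CNOT, u1(-theta), CNOT, u1(theta) on the target,
# plus a u1(theta) on the control (some control phases are hoisted into the
# opening block of u1 gates). That five-gate pattern equals a controlled
# phase of 2*theta; a quick NumPy check of the identity (illustrative only,
# not part of the benchmark):
#
#     import numpy as np
#
#     def phase(theta):
#         # single-qubit u1 gate: diag(1, e^{i*theta})
#         return np.diag([1.0, np.exp(1j * theta)])
#
#     CNOT = np.array([[1, 0, 0, 0],
#                      [0, 1, 0, 0],
#                      [0, 0, 0, 1],
#                      [0, 0, 1, 0]], dtype=complex)  # control = first qubit
#     I2 = np.eye(2)
#
#     theta = np.pi / 8
#     # circuit order: phase(theta) on control, CNOT, phase(-theta) on target,
#     # CNOT, phase(theta) on target -> matrices compose right-to-left
#     seq = (np.kron(I2, phase(theta)) @ CNOT @ np.kron(I2, phase(-theta))
#            @ CNOT @ np.kron(phase(theta), I2))
#     cphase = np.diag([1, 1, 1, np.exp(2j * theta)])  # controlled phase of 2*theta
#     assert np.allclose(seq, cphase)

```python
import numpy as np

def phase(theta):
    # single-qubit u1 gate: diag(1, e^{i*theta})
    return np.diag([1.0, np.exp(1j * theta)])

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)  # control = first qubit
I2 = np.eye(2)

theta = np.pi / 8
# circuit order: phase(theta) on control, CNOT, phase(-theta) on target,
# CNOT, phase(theta) on target -> matrices compose right-to-left
seq = (np.kron(I2, phase(theta)) @ CNOT @ np.kron(I2, phase(-theta))
       @ CNOT @ np.kron(phase(theta), I2))
cphase = np.diag([1, 1, 1, np.exp(2j * theta)])  # controlled phase of 2*theta
assert np.allclose(seq, cphase)
```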
circuit = cirq.Circuit(
cirq.H(q[0]),
u1(0.785398163397448)(q[1]),
u1(0.392699081698724)(q[2]),
u1(0.196349540849362)(q[3]),
u1(0.098174770424681)(q[4]),
u1(0.0490873852123405)(q[5]),
u1(0.0245436926061703)(q[6]),
u1(0.0122718463030851)(q[7]),
u1(0.00613592315154256)(q[8]),
u1(0.00306796157577128)(q[9]),
u1(0.00153398078788564)(q[10]),
u1(7.66990393942821e-4)(q[11]),
u1(3.8349519697141e-4)(q[12]),
u1(1.91747598485705e-4)(q[13]),
u1(9.58737992428526e-5)(q[14]),
cirq.CNOT(q[1], q[0]),
u1(-0.785398163397448)(q[0]),
cirq.CNOT(q[1], q[0]),
u1(0.785398163397448)(q[0]),
cirq.H(q[1]),
cirq.CNOT(q[2], q[0]),
u1(-0.392699081698724)(q[0]),
cirq.CNOT(q[2], q[0]),
u1(0.392699081698724)(q[0]),
u1(0.785398163397448)(q[2]),
cirq.CNOT(q[2], q[1]),
u1(-0.785398163397448)(q[1]),
cirq.CNOT(q[2], q[1]),
u1(0.785398163397448)(q[1]),
cirq.H(q[2]),
cirq.CNOT(q[3], q[0]),
u1(-0.196349540849362)(q[0]),
cirq.CNOT(q[3], q[0]),
u1(0.196349540849362)(q[0]),
u1(0.392699081698724)(q[3]),
cirq.CNOT(q[3], q[1]),
u1(-0.392699081698724)(q[1]),
cirq.CNOT(q[3], q[1]),
u1(0.392699081698724)(q[1]),
u1(0.785398163397448)(q[3]),
cirq.CNOT(q[3], q[2]),
u1(-0.785398163397448)(q[2]),
cirq.CNOT(q[3], q[2]),
u1(0.785398163397448)(q[2]),
cirq.H(q[3]),
cirq.CNOT(q[4], q[0]),
u1(-0.098174770424681)(q[0]),
cirq.CNOT(q[4], q[0]),
u1(0.098174770424681)(q[0]),
u1(0.196349540849362)(q[4]),
cirq.CNOT(q[4], q[1]),
u1(-0.196349540849362)(q[1]),
cirq.CNOT(q[4], q[1]),
u1(0.196349540849362)(q[1]),
u1(0.392699081698724)(q[4]),
cirq.CNOT(q[4], q[2]),
u1(-0.392699081698724)(q[2]),
cirq.CNOT(q[4], q[2]),
u1(0.392699081698724)(q[2]),
u1(0.785398163397448)(q[4]),
cirq.CNOT(q[4], q[3]),
u1(-0.785398163397448)(q[3]),
cirq.CNOT(q[4], q[3]),
u1(0.785398163397448)(q[3]),
cirq.H(q[4]),
cirq.CNOT(q[5], q[0]),
u1(-0.0490873852123405)(q[0]),
cirq.CNOT(q[5], q[0]),
u1(0.0490873852123405)(q[0]),
u1(0.098174770424681)(q[5]),
cirq.CNOT(q[5], q[1]),
u1(-0.098174770424681)(q[1]),
cirq.CNOT(q[5], q[1]),
u1(0.098174770424681)(q[1]),
u1(0.196349540849362)(q[5]),
cirq.CNOT(q[5], q[2]),
u1(-0.196349540849362)(q[2]),
cirq.CNOT(q[5], q[2]),
u1(0.196349540849362)(q[2]),
u1(0.392699081698724)(q[5]),
cirq.CNOT(q[5], q[3]),
u1(-0.392699081698724)(q[3]),
cirq.CNOT(q[5], q[3]),
u1(0.392699081698724)(q[3]),
u1(0.785398163397448)(q[5]),
cirq.CNOT(q[5], q[4]),
u1(-0.785398163397448)(q[4]),
cirq.CNOT(q[5], q[4]),
u1(0.785398163397448)(q[4]),
cirq.H(q[5]),
cirq.CNOT(q[6], q[0]),
u1(-0.0245436926061703)(q[0]),
cirq.CNOT(q[6], q[0]),
u1(0.0245436926061703)(q[0]),
u1(0.0490873852123405)(q[6]),
cirq.CNOT(q[6], q[1]),
u1(-0.0490873852123405)(q[1]),
cirq.CNOT(q[6], q[1]),
u1(0.0490873852123405)(q[1]),
u1(0.098174770424681)(q[6]),
cirq.CNOT(q[6], q[2]),
u1(-0.098174770424681)(q[2]),
cirq.CNOT(q[6], q[2]),
u1(0.098174770424681)(q[2]),
u1(0.196349540849362)(q[6]),
cirq.CNOT(q[6], q[3]),
u1(-0.196349540849362)(q[3]),
cirq.CNOT(q[6], q[3]),
u1(0.196349540849362)(q[3]),
u1(0.392699081698724)(q[6]),
cirq.CNOT(q[6], q[4]),
u1(-0.392699081698724)(q[4]),
cirq.CNOT(q[6], q[4]),
u1(0.392699081698724)(q[4]),
u1(0.785398163397448)(q[6]),
cirq.CNOT(q[6], q[5]),
u1(-0.785398163397448)(q[5]),
cirq.CNOT(q[6], q[5]),
u1(0.785398163397448)(q[5]),
cirq.H(q[6]),
cirq.CNOT(q[7], q[0]),
u1(-0.0122718463030851)(q[0]),
cirq.CNOT(q[7], q[0]),
u1(0.0122718463030851)(q[0]),
u1(0.0245436926061703)(q[7]),
cirq.CNOT(q[7], q[1]),
u1(-0.0245436926061703)(q[1]),
cirq.CNOT(q[7], q[1]),
u1(0.0245436926061703)(q[1]),
u1(0.0490873852123405)(q[7]),
cirq.CNOT(q[7], q[2]),
u1(-0.0490873852123405)(q[2]),
cirq.CNOT(q[7], q[2]),
u1(0.0490873852123405)(q[2]),
u1(0.098174770424681)(q[7]),
cirq.CNOT(q[7], q[3]),
u1(-0.098174770424681)(q[3]),
cirq.CNOT(q[7], q[3]),
u1(0.098174770424681)(q[3]),
u1(0.196349540849362)(q[7]),
cirq.CNOT(q[7], q[4]),
u1(-0.196349540849362)(q[4]),
cirq.CNOT(q[7], q[4]),
u1(0.196349540849362)(q[4]),
u1(0.392699081698724)(q[7]),
cirq.CNOT(q[7], q[5]),
u1(-0.392699081698724)(q[5]),
cirq.CNOT(q[7], q[5]),
u1(0.392699081698724)(q[5]),
u1(0.785398163397448)(q[7]),
cirq.CNOT(q[7], q[6]),
u1(-0.785398163397448)(q[6]),
cirq.CNOT(q[7], q[6]),
u1(0.785398163397448)(q[6]),
cirq.H(q[7]),
cirq.CNOT(q[8], q[0]),
u1(-0.00613592315154256)(q[0]),
cirq.CNOT(q[8], q[0]),
u1(0.00613592315154256)(q[0]),
u1(0.0122718463030851)(q[8]),
cirq.CNOT(q[8], q[1]),
u1(-0.0122718463030851)(q[1]),
cirq.CNOT(q[8], q[1]),
u1(0.0122718463030851)(q[1]),
u1(0.0245436926061703)(q[8]),
cirq.CNOT(q[8], q[2]),
u1(-0.0245436926061703)(q[2]),
cirq.CNOT(q[8], q[2]),
u1(0.0245436926061703)(q[2]),
u1(0.0490873852123405)(q[8]),
cirq.CNOT(q[8], q[3]),
u1(-0.0490873852123405)(q[3]),
cirq.CNOT(q[8], q[3]),
u1(0.0490873852123405)(q[3]),
u1(0.098174770424681)(q[8]),
cirq.CNOT(q[8], q[4]),
u1(-0.098174770424681)(q[4]),
cirq.CNOT(q[8], q[4]),
u1(0.098174770424681)(q[4]),
u1(0.196349540849362)(q[8]),
cirq.CNOT(q[8], q[5]),
u1(-0.196349540849362)(q[5]),
cirq.CNOT(q[8], q[5]),
u1(0.196349540849362)(q[5]),
u1(0.392699081698724)(q[8]),
cirq.CNOT(q[8], q[6]),
u1(-0.392699081698724)(q[6]),
cirq.CNOT(q[8], q[6]),
u1(0.392699081698724)(q[6]),
u1(0.785398163397448)(q[8]),
cirq.CNOT(q[8], q[7]),
u1(-0.785398163397448)(q[7]),
cirq.CNOT(q[8], q[7]),
u1(0.785398163397448)(q[7]),
cirq.H(q[8]),
cirq.CNOT(q[9], q[0]),
u1(-0.00306796157577128)(q[0]),
cirq.CNOT(q[9], q[0]),
u1(0.00306796157577128)(q[0]),
u1(0.00613592315154256)(q[9]),
cirq.CNOT(q[9], q[1]),
u1(-0.00613592315154256)(q[1]),
cirq.CNOT(q[9], q[1]),
u1(0.00613592315154256)(q[1]),
u1(0.0122718463030851)(q[9]),
cirq.CNOT(q[9], q[2]),
u1(-0.0122718463030851)(q[2]),
cirq.CNOT(q[9], q[2]),
u1(0.0122718463030851)(q[2]),
u1(0.0245436926061703)(q[9]),
cirq.CNOT(q[9], q[3]),
u1(-0.0245436926061703)(q[3]),
cirq.CNOT(q[9], q[3]),
u1(0.0245436926061703)(q[3]),
u1(0.0490873852123405)(q[9]),
cirq.CNOT(q[9], q[4]),
u1(-0.0490873852123405)(q[4]),
cirq.CNOT(q[9], q[4]),
u1(0.0490873852123405)(q[4]),
u1(0.098174770424681)(q[9]),
cirq.CNOT(q[9], q[5]),
u1(-0.098174770424681)(q[5]),
cirq.CNOT(q[9], q[5]),
u1(0.098174770424681)(q[5]),
u1(0.196349540849362)(q[9]),
cirq.CNOT(q[9], q[6]),
u1(-0.196349540849362)(q[6]),
cirq.CNOT(q[9], q[6]),
u1(0.196349540849362)(q[6]),
u1(0.392699081698724)(q[9]),
cirq.CNOT(q[9], q[7]),
u1(-0.392699081698724)(q[7]),
cirq.CNOT(q[9], q[7]),
u1(0.392699081698724)(q[7]),
u1(0.785398163397448)(q[9]),
cirq.CNOT(q[9], q[8]),
u1(-0.785398163397448)(q[8]),
cirq.CNOT(q[9], q[8]),
u1(0.785398163397448)(q[8]),
cirq.H(q[9]),
cirq.CNOT(q[10], q[0]),
u1(-0.00153398078788564)(q[0]),
cirq.CNOT(q[10], q[0]),
u1(0.00153398078788564)(q[0]),
u1(0.00306796157577128)(q[10]),
cirq.CNOT(q[10], q[1]),
u1(-0.00306796157577128)(q[1]),
cirq.CNOT(q[10], q[1]),
u1(0.00306796157577128)(q[1]),
u1(0.00613592315154256)(q[10]),
cirq.CNOT(q[10], q[2]),
u1(-0.00613592315154256)(q[2]),
cirq.CNOT(q[10], q[2]),
u1(0.00613592315154256)(q[2]),
u1(0.0122718463030851)(q[10]),
cirq.CNOT(q[10], q[3]),
u1(-0.0122718463030851)(q[3]),
cirq.CNOT(q[10], q[3]),
u1(0.0122718463030851)(q[3]),
u1(0.0245436926061703)(q[10]),
cirq.CNOT(q[10], q[4]),
u1(-0.0245436926061703)(q[4]),
cirq.CNOT(q[10], q[4]),
u1(0.0245436926061703)(q[4]),
u1(0.0490873852123405)(q[10]),
cirq.CNOT(q[10], q[5]),
u1(-0.0490873852123405)(q[5]),
cirq.CNOT(q[10], q[5]),
u1(0.0490873852123405)(q[5]),
u1(0.098174770424681)(q[10]),
cirq.CNOT(q[10], q[6]),
u1(-0.098174770424681)(q[6]),
cirq.CNOT(q[10], q[6]),
u1(0.098174770424681)(q[6]),
u1(0.196349540849362)(q[10]),
cirq.CNOT(q[10], q[7]),
u1(-0.196349540849362)(q[7]),
cirq.CNOT(q[10], q[7]),
u1(0.196349540849362)(q[7]),
u1(0.392699081698724)(q[10]),
cirq.CNOT(q[10], q[8]),
u1(-0.392699081698724)(q[8]),
cirq.CNOT(q[10], q[8]),
u1(0.392699081698724)(q[8]),
u1(0.785398163397448)(q[10]),
cirq.CNOT(q[10], q[9]),
u1(-0.785398163397448)(q[9]),
cirq.CNOT(q[10], q[9]),
u1(0.785398163397448)(q[9]),
cirq.H(q[10]),
cirq.CNOT(q[11], q[0]),
u1(-7.66990393942821e-4)(q[0]),
cirq.CNOT(q[11], q[0]),
u1(7.66990393942821e-4)(q[0]),
u1(0.00153398078788564)(q[11]),
cirq.CNOT(q[11], q[1]),
u1(-0.00153398078788564)(q[1]),
cirq.CNOT(q[11], q[1]),
u1(0.00153398078788564)(q[1]),
u1(0.00306796157577128)(q[11]),
cirq.CNOT(q[11], q[2]),
u1(-0.00306796157577128)(q[2]),
cirq.CNOT(q[11], q[2]),
u1(0.00306796157577128)(q[2]),
u1(0.00613592315154256)(q[11]),
cirq.CNOT(q[11], q[3]),
u1(-0.00613592315154256)(q[3]),
cirq.CNOT(q[11], q[3]),
u1(0.00613592315154256)(q[3]),
u1(0.0122718463030851)(q[11]),
cirq.CNOT(q[11], q[4]),
u1(-0.0122718463030851)(q[4]),
cirq.CNOT(q[11], q[4]),
u1(0.0122718463030851)(q[4]),
u1(0.0245436926061703)(q[11]),
cirq.CNOT(q[11], q[5]),
u1(-0.0245436926061703)(q[5]),
cirq.CNOT(q[11], q[5]),
u1(0.0245436926061703)(q[5]),
u1(0.0490873852123405)(q[11]),
cirq.CNOT(q[11], q[6]),
u1(-0.0490873852123405)(q[6]),
cirq.CNOT(q[11], q[6]),
u1(0.0490873852123405)(q[6]),
u1(0.098174770424681)(q[11]),
cirq.CNOT(q[11], q[7]),
u1(-0.098174770424681)(q[7]),
cirq.CNOT(q[11], q[7]),
u1(0.098174770424681)(q[7]),
u1(0.196349540849362)(q[11]),
cirq.CNOT(q[11], q[8]),
u1(-0.196349540849362)(q[8]),
cirq.CNOT(q[11], q[8]),
u1(0.196349540849362)(q[8]),
u1(0.392699081698724)(q[11]),
cirq.CNOT(q[11], q[9]),
u1(-0.392699081698724)(q[9]),
cirq.CNOT(q[11], q[9]),
u1(0.392699081698724)(q[9]),
u1(0.785398163397448)(q[11]),
cirq.CNOT(q[11], q[10]),
u1(-0.785398163397448)(q[10]),
cirq.CNOT(q[11], q[10]),
u1(0.785398163397448)(q[10]),
cirq.H(q[11]),
cirq.CNOT(q[12], q[0]),
u1(-3.8349519697141e-4)(q[0]),
cirq.CNOT(q[12], q[0]),
u1(3.8349519697141e-4)(q[0]),
u1(7.66990393942821e-4)(q[12]),
cirq.CNOT(q[12], q[1]),
u1(-7.66990393942821e-4)(q[1]),
cirq.CNOT(q[12], q[1]),
u1(7.66990393942821e-4)(q[1]),
u1(0.00153398078788564)(q[12]),
cirq.CNOT(q[12], q[2]),
u1(-0.00153398078788564)(q[2]),
cirq.CNOT(q[12], q[2]),
u1(0.00153398078788564)(q[2]),
u1(0.00306796157577128)(q[12]),
cirq.CNOT(q[12], q[3]),
u1(-0.00306796157577128)(q[3]),
cirq.CNOT(q[12], q[3]),
u1(0.00306796157577128)(q[3]),
u1(0.00613592315154256)(q[12]),
cirq.CNOT(q[12], q[4]),
u1(-0.00613592315154256)(q[4]),
cirq.CNOT(q[12], q[4]),
u1(0.00613592315154256)(q[4]),
u1(0.0122718463030851)(q[12]),
cirq.CNOT(q[12], q[5]),
u1(-0.0122718463030851)(q[5]),
cirq.CNOT(q[12], q[5]),
u1(0.0122718463030851)(q[5]),
u1(0.0245436926061703)(q[12]),
cirq.CNOT(q[12], q[6]),
u1(-0.0245436926061703)(q[6]),
cirq.CNOT(q[12], q[6]),
u1(0.0245436926061703)(q[6]),
u1(0.0490873852123405)(q[12]),
cirq.CNOT(q[12], q[7]),
u1(-0.0490873852123405)(q[7]),
cirq.CNOT(q[12], q[7]),
u1(0.0490873852123405)(q[7]),
u1(0.098174770424681)(q[12]),
cirq.CNOT(q[12], q[8]),
u1(-0.098174770424681)(q[8]),
cirq.CNOT(q[12], q[8]),
u1(0.098174770424681)(q[8]),
u1(0.196349540849362)(q[12]),
cirq.CNOT(q[12], q[9]),
u1(-0.196349540849362)(q[9]),
cirq.CNOT(q[12], q[9]),
u1(0.196349540849362)(q[9]),
u1(0.392699081698724)(q[12]),
cirq.CNOT(q[12], q[10]),
u1(-0.392699081698724)(q[10]),
cirq.CNOT(q[12], q[10]),
u1(0.392699081698724)(q[10]),
u1(0.785398163397448)(q[12]),
cirq.CNOT(q[12], q[11]),
u1(-0.785398163397448)(q[11]),
cirq.CNOT(q[12], q[11]),
u1(0.785398163397448)(q[11]),
cirq.H(q[12]),
cirq.CNOT(q[13], q[0]),
u1(-1.91747598485705e-4)(q[0]),
cirq.CNOT(q[13], q[0]),
u1(1.91747598485705e-4)(q[0]),
u1(3.8349519697141e-4)(q[13]),
cirq.CNOT(q[13], q[1]),
u1(-3.8349519697141e-4)(q[1]),
cirq.CNOT(q[13], q[1]),
u1(3.8349519697141e-4)(q[1]),
u1(7.66990393942821e-4)(q[13]),
cirq.CNOT(q[13], q[2]),
u1(-7.66990393942821e-4)(q[2]),
cirq.CNOT(q[13], q[2]),
u1(7.66990393942821e-4)(q[2]),
u1(0.00153398078788564)(q[13]),
cirq.CNOT(q[13], q[3]),
u1(-0.00153398078788564)(q[3]),
cirq.CNOT(q[13], q[3]),
u1(0.00153398078788564)(q[3]),
u1(0.00306796157577128)(q[13]),
cirq.CNOT(q[13], q[4]),
u1(-0.00306796157577128)(q[4]),
cirq.CNOT(q[13], q[4]),
u1(0.00306796157577128)(q[4]),
u1(0.00613592315154256)(q[13]),
cirq.CNOT(q[13], q[5]),
u1(-0.00613592315154256)(q[5]),
cirq.CNOT(q[13], q[5]),
u1(0.00613592315154256)(q[5]),
u1(0.0122718463030851)(q[13]),
cirq.CNOT(q[13], q[6]),
u1(-0.0122718463030851)(q[6]),
cirq.CNOT(q[13], q[6]),
u1(0.0122718463030851)(q[6]),
u1(0.0245436926061703)(q[13]),
cirq.CNOT(q[13], q[7]),
u1(-0.0245436926061703)(q[7]),
cirq.CNOT(q[13], q[7]),
u1(0.0245436926061703)(q[7]),
u1(0.0490873852123405)(q[13]),
cirq.CNOT(q[13], q[8]),
u1(-0.0490873852123405)(q[8]),
cirq.CNOT(q[13], q[8]),
u1(0.0490873852123405)(q[8]),
u1(0.098174770424681)(q[13]),
cirq.CNOT(q[13], q[9]),
u1(-0.098174770424681)(q[9]),
cirq.CNOT(q[13], q[9]),
u1(0.098174770424681)(q[9]),
u1(0.196349540849362)(q[13]),
cirq.CNOT(q[13], q[10]),
u1(-0.196349540849362)(q[10]),
cirq.CNOT(q[13], q[10]),
u1(0.196349540849362)(q[10]),
u1(0.392699081698724)(q[13]),
cirq.CNOT(q[13], q[11]),
u1(-0.392699081698724)(q[11]),
cirq.CNOT(q[13], q[11]),
u1(0.392699081698724)(q[11]),
u1(0.785398163397448)(q[13]),
cirq.CNOT(q[13], q[12]),
u1(-0.785398163397448)(q[12]),
cirq.CNOT(q[13], q[12]),
u1(0.785398163397448)(q[12]),
cirq.H(q[13]),
cirq.CNOT(q[14], q[0]),
u1(-9.58737992428526e-5)(q[0]),
cirq.CNOT(q[14], q[0]),
u1(9.58737992428526e-5)(q[0]),
u1(1.91747598485705e-4)(q[14]),
cirq.CNOT(q[14], q[1]),
u1(-1.91747598485705e-4)(q[1]),
cirq.CNOT(q[14], q[1]),
u1(1.91747598485705e-4)(q[1]),
u1(3.8349519697141e-4)(q[14]),
cirq.CNOT(q[14], q[2]),
u1(-3.8349519697141e-4)(q[2]),
cirq.CNOT(q[14], q[2]),
u1(3.8349519697141e-4)(q[2]),
u1(7.66990393942821e-4)(q[14]),
cirq.CNOT(q[14], q[3]),
u1(-7.66990393942821e-4)(q[3]),
cirq.CNOT(q[14], q[3]),
u1(7.66990393942821e-4)(q[3]),
u1(0.00153398078788564)(q[14]),
cirq.CNOT(q[14], q[4]),
u1(-0.00153398078788564)(q[4]),
cirq.CNOT(q[14], q[4]),
u1(0.00153398078788564)(q[4]),
u1(0.00306796157577128)(q[14]),
cirq.CNOT(q[14], q[5]),
u1(-0.00306796157577128)(q[5]),
cirq.CNOT(q[14], q[5]),
u1(0.00306796157577128)(q[5]),
u1(0.00613592315154256)(q[14]),
cirq.CNOT(q[14], q[6]),
u1(-0.00613592315154256)(q[6]),
cirq.CNOT(q[14], q[6]),
u1(0.00613592315154256)(q[6]),
u1(0.0122718463030851)(q[14]),
cirq.CNOT(q[14], q[7]),
u1(-0.0122718463030851)(q[7]),
cirq.CNOT(q[14], q[7]),
u1(0.0122718463030851)(q[7]),
u1(0.0245436926061703)(q[14]),
cirq.CNOT(q[14], q[8]),
u1(-0.0245436926061703)(q[8]),
cirq.CNOT(q[14], q[8]),
u1(0.0245436926061703)(q[8]),
u1(0.0490873852123405)(q[14]),
cirq.CNOT(q[14], q[9]),
u1(-0.0490873852123405)(q[9]),
cirq.CNOT(q[14], q[9]),
u1(0.0490873852123405)(q[9]),
u1(0.098174770424681)(q[14]),
cirq.CNOT(q[14], q[10]),
u1(-0.098174770424681)(q[10]),
cirq.CNOT(q[14], q[10]),
u1(0.098174770424681)(q[10]),
u1(0.196349540849362)(q[14]),
cirq.CNOT(q[14], q[11]),
u1(-0.196349540849362)(q[11]),
cirq.CNOT(q[14], q[11]),
u1(0.196349540849362)(q[11]),
u1(0.392699081698724)(q[14]),
cirq.CNOT(q[14], q[12]),
u1(-0.392699081698724)(q[12]),
cirq.CNOT(q[14], q[12]),
u1(0.392699081698724)(q[12]),
u1(0.785398163397448)(q[14]),
cirq.CNOT(q[14], q[13]),
u1(-0.785398163397448)(q[13]),
cirq.CNOT(q[14], q[13]),
u1(0.785398163397448)(q[13]),
cirq.H(q[14]),
cirq.measure(q[0], key='c0'),
cirq.measure(q[1], key='c1'),
cirq.measure(q[2], key='c2'),
cirq.measure(q[3], key='c3'),
cirq.measure(q[4], key='c4'),
cirq.measure(q[5], key='c5'),
cirq.measure(q[6], key='c6'),
cirq.measure(q[7], key='c7'),
cirq.measure(q[8], key='c8'),
cirq.measure(q[9], key='c9'),
cirq.measure(q[10], key='c10'),
cirq.measure(q[11], key='c11'),
cirq.measure(q[12], key='c12'),
cirq.measure(q[13], key='c13'),
cirq.measure(q[14], key='c14')
)
start = time.time()
simulator = cirq.Simulator()
result = simulator.run(circuit, repetitions=1)
result_dict = dict(result.multi_measurement_histogram(keys=['c0', 'c1', 'c2', 'c3', 'c4', 'c5', 'c6', 'c7', 'c8', 'c9', 'c10', 'c11', 'c12', 'c13', 'c14']))
keys = ["".join(str(bit) for bit in arr[::-1]) for arr in result_dict.keys()]
counts = dict(zip(keys, result_dict.values()))
#print(counts)
end = time.time()
print("qft_n15 simulate on Cirq:" + str(end-start))
# File: package/ja_pytest_v2.py (repo: wolfmib/ja_slot_line, license: MIT)
import copy
"""
let's test pytest.
v1: meet basic symbol-win-lines: only for 3, 4 , 5 line wins
🌟🌟🌟🌟
v2: add S1: 3 (counts), 4(counts), 5(counts) winning scores.
"""
# slot settings for 25 lines
# thanks to Mary for the implementation...
line_25_setting = [
[1,1,1,1,1], # 1
[0,0,0,0,0], # 2
[2,2,2,2,2], # 3
[0,1,2,1,0], # 4
[2,1,0,1,2], # 5
[0,0,1,0,0], # 6
[2,2,1,2,2], # 7
[1,2,2,2,1], # 8
[1,0,0,0,1], # 9
[1,0,1,0,1], # 10
[1,2,1,2,1], # 11
[0,1,0,1,0], # 12
[2,1,2,1,2], # 13
[1,1,0,1,1], # 14
[1,1,2,1,1], # 15
[0,1,1,1,0], # 16
[2,1,1,1,2], # 17
[0,2,0,2,0], # 18
[2,0,2,0,2], # 19
[0,2,2,2,0], # 20
[2,0,0,0,2], # 21
[0,0,2,0,0], # 22
[2,2,0,2,2], # 23
[0,2,1,2,0], # 24
[2,0,1,0,2] # 25
]
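
# Each inner list above gives, per reel (column), the row of the 3x5 symbol
# window that the payline passes through. A small helper (hypothetical, not
# part of the game code) shows how a line setting selects its symbols:
#
#     def symbols_on_line(window, line):
#         # window: 3 rows x 5 reels of symbol names; line: row index per reel
#         return [window[row][reel] for reel, row in enumerate(line)]
#
#     window = [
#         ["H1", "H2", "H3", "H4", "H5"],  # row 0
#         ["L1", "L2", "L3", "L4", "L5"],  # row 1
#         ["L6", "W1", "S1", "H1", "H2"],  # row 2
#     ]
#     line_4 = [0, 1, 2, 1, 0]  # line 4 of the table above
#     symbols_on_line(window, line_4)  # -> ['H1', 'L2', 'S1', 'L4', 'H5']

```python
def symbols_on_line(window, line):
    # window: 3 rows x 5 reels of symbol names; line: row index for each reel
    return [window[row][reel] for reel, row in enumerate(line)]

window = [
    ["H1", "H2", "H3", "H4", "H5"],  # row 0
    ["L1", "L2", "L3", "L4", "L5"],  # row 1
    ["L6", "W1", "S1", "H1", "H2"],  # row 2
]
line_4 = [0, 1, 2, 1, 0]  # line 4 of line_25_setting
result = symbols_on_line(window, line_4)
```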
line_obj_bet_rate = {
"H1": {2: 2, 3: 30, 4: 200, 5: 750},
"H2": {2: 2, 3: 30, 4: 100, 5: 750},
"H3": {2: 0, 3: 25, 4: 75, 5: 400},
"H4": {2: 0, 3: 20, 4: 75, 5: 300},
"H5": {2: 0, 3: 20, 4: 50, 5: 300},
"L1": {2: 0, 3: 15, 4: 30, 5: 150},
"L2": {2: 0, 3: 15, 4: 30, 5: 150},
"L3": {2: 0, 3: 10, 4: 25, 5: 130},
"L4": {2: 0, 3: 10, 4: 25, 5: 130},
"L5": {2: 0, 3: 5, 4: 20, 5: 100},
"L6": {2: 2, 3: 5, 4: 20, 5: 100},
"W1": {2: 35, 3: 250, 4: 2500, 5: 10000},
"S1": {2: 2, 3: 5, 4: 20, 5: 500},
}
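
# Each entry above maps a symbol to its bet multiplier per matched count
# (0 means that count does not pay). A minimal payout lookup, sketched
# against an excerpt of the table (helper name is hypothetical):
#
#     def line_win(symbol, count, bet_per_line):
#         # pay = table multiplier for this symbol/count times the line bet
#         return line_obj_bet_rate.get(symbol, {}).get(count, 0) * bet_per_line
#
#     line_win("H1", 3, 2)  # 60: three H1 at 2 credits per line
#     line_win("L5", 2, 2)  # 0: two L5 do not pay

```python
bet_rate = {
    "H1": {2: 2, 3: 30, 4: 200, 5: 750},
    "L5": {2: 0, 3: 5, 4: 20, 5: 100},
}  # excerpt of the full table

def line_win(symbol, count, bet_per_line):
    # pay = table multiplier for this symbol/count times the line bet
    return bet_rate.get(symbol, {}).get(count, 0) * bet_per_line
```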
# 🌟🌟🌟🌟: add the S1 coverage
#         : also add the 2-count cases for H1-H5 and L1-L6.
def _all_lines_uncovered(num_lines=25):
    return {line_no: False for line_no in range(1, num_lines + 1)}

line_obj_25_test_coverage = {
    symbol: {count: _all_lines_uncovered() for count in (2, 3, 4, 5)}
    for symbol in ["H1", "H2", "H3", "H4", "H5",
                   "L1", "L2", "L3", "L4", "L5", "L6", "W1"]
}
# S1 is tracked once per count, not per line
line_obj_25_test_coverage["S1"] = {count: {1: False} for count in (2, 3, 4, 5)}
def test_non_wild(input_obj,input_list):
test_list = copy.deepcopy(input_list)
if input_obj in test_list:
while input_obj in test_list:
test_list.remove(input_obj)
while "W1" in test_list:
test_list.remove("W1")
else:
return False
if not test_list:
return True
else:
return False
## Notes
## The system normally reports a W1 3-in-a-row; use this to confirm whether the board is actually a 4-in-a-row. True: it is, False: it is not.
## input_obj = "W1"
## input_list = ["W1","W1","W1","W1"] return True
## input_obj = "W1"
## input_list = ["W1","W1","W1","H1"] return False
def test_wild_v1(input_obj, input_list):
test_list = copy.deepcopy(input_list)
if input_obj in test_list:
while "W1" in test_list:
test_list.remove("W1")
else:
return False
    # input_list consisted entirely of wilds; test_list is empty now.
if not test_list:
return True
    # input_list was not entirely wilds; test_list is NOT empty.
else:
return False
class agent():
def __init__(self):
        print("🧪 :[ja_pytest]: Please tune the parameters in this script so that the test will work with the correct setting information")
print("🧪 :[ja_pytest]: you get it ?")
input()
self.line_setting = line_25_setting
self.line_obj_bet_rate = line_obj_bet_rate
self.line_obj_25_test_coverage = line_obj_25_test_coverage
print(self.line_setting)
print("-------")
print(self.line_obj_bet_rate)
print("---------")
input("🧪 :[ja_pytest]: confirm two setting !")
print("------------")
print(self.line_obj_25_test_coverage)
print("------------")
input("🧪 :[ja_pytest]: confirm test coverage map !\n")
# 🌟🌟🌟: how_many_cases: v2 version. add new cases S1.
cnt = 0
for any_obj in self.line_obj_25_test_coverage:
for any_win_line in self.line_obj_25_test_coverage[any_obj]:
tem_len = len(self.line_obj_25_test_coverage[any_obj][any_win_line])
cnt += tem_len
print("🧪 :[ja_pytest]: load ", any_obj, " ",any_win_line, " ", tem_len, " total_size = ",cnt)
self.how_many_cases = cnt
self.passed_cases = 0
input("🧪 :[ja_pytest]: total test cases = %4d, current passed cases = %2d \n"%(self.how_many_cases,self.passed_cases))
# Wild notification
print("🧪 : This test will apply only 'W1' for step_2_test_line_index() testing ")
input("[ja_pytest]: do you copy ?")
    # Example pay-table entries: H1 with win_line 3 -> 50, H1 with win_line 4 -> 80.
    # If the rates match, return True; on a mismatch, report and assert.
def step_1_test_bet_amount(self,test_bet_rate,input_obj,input_win_line):
if test_bet_rate == self.line_obj_bet_rate[input_obj][input_win_line]:
return True
else:
print("🧪 :[ja_pytest]: raised error with test_bet_amount")
            print("test_bet_rate = ", test_bet_rate, " self.line_obj_bet_rate[%s][%d] = "%(input_obj,input_win_line), self.line_obj_bet_rate[input_obj][input_win_line])
assert test_bet_rate == self.line_obj_bet_rate[input_obj][input_win_line]
    # W1, or (W1, W2) pairs, depending on the slot game; implement it yourself.
    # For now the only case is "W1 only".
def step_2_test_line_index(self,test_line_index,input_obj,input_win_line,input_list_r):
        # 💡: quick error display function.
def __show_step_2_error_info(ii_test_line_index,ii_obj,ii_win_line,ii_list_r):
ui_line_index = ii_test_line_index + 1
print("ii_test_line_index = ",ui_line_index)
print("ii_obj = ",ii_obj)
print("ii_win_line = ",ii_win_line)
print("ii_list_r = ",ii_list_r)
# 🌟🌟🌟: add S1 Cases
if input_obj == "S1":
print("[ja_pytest]: meet S1 cases")
            # the S1 win_lines 3, 4, 5 each have only one test_line_index
assert test_line_index == 1
# count S1 again
s1_check_cnts = 0
for any_col_list in input_list_r:
for any_obj in any_col_list:
if any_obj == "S1":
s1_check_cnts += 1
if s1_check_cnts != input_win_line:
                # 💡 Let me light the way for you
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
return False
else:
return True
# 🌟🌟🌟: implement the 2, 3, 4, 5 win_line later... 🙃
else:
obj_1 = input_list_r[0][self.line_setting[test_line_index][0]]
obj_2 = input_list_r[1][self.line_setting[test_line_index][1]]
obj_3 = input_list_r[2][self.line_setting[test_line_index][2]]
obj_4 = input_list_r[3][self.line_setting[test_line_index][3]]
obj_5 = input_list_r[4][self.line_setting[test_line_index][4]]
# consider non-wild case and in the line_setting dict:
if input_obj in self.line_obj_bet_rate:
# add error_cases, input_win_line might be 4 or 5
if input_win_line == 3:
# check error_case win_line=5
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_non_wild(input_obj,check_win_5_list)
if res == True:
print("[ja_pytest]: False: judge win_line 3, but the win_line is 5 ")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
else:
pass
# check error_case win_line = 4
check_win_4_list = [obj_1,obj_2,obj_3,obj_4]
res = test_non_wild(input_obj,check_win_4_list)
if res == True:
print("[ja_pytest]: False: judge win_line 3, but the win_line is 4")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
else:
pass
# check it's 3 line or not
check_win_3_list = [obj_1,obj_2,obj_3]
res = test_non_wild(input_obj,check_win_3_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 3, but the win_line is not")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
# add error case, input_win_line= 5
elif input_win_line == 4:
# check error_case win_line=5
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_non_wild(input_obj,check_win_5_list)
if res == True:
print("[ja_pytest]: False: judge win_line 4, but the win_line is 5")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
else:
pass
# check it's 4 line or not
check_win_4_list = [obj_1,obj_2,obj_3,obj_4]
res = test_non_wild(input_obj,check_win_4_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 4, but the win_line is not")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
elif input_win_line == 5:
# check it's 5 line or not
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_non_wild(input_obj,check_win_5_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 5, but the win_line is not")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
elif input_win_line == 2:
# check it's 2 line or not
check_win_2_list = [obj_1,obj_2]
res = test_non_wild(input_obj,check_win_2_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 2, but the win_line is not")
# 💡
__show_step_2_error_info(test_line_index,input_obj,input_win_line,input_list_r)
# didn't pass test
return False
else:
                print("[ja_pytest]: False: unsupported win_line (expected 2, 3, 4 or 5): ",input_win_line)
return False
            # All failure cases are handled above; the remaining case passes, so return True.
return True
# consider the "Wild case"
elif input_obj == "W1":
# add error_cases, input_win_line might be 4 or 5
if input_win_line == 3:
# check error_case win_line=5
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_wild_v1(input_obj,check_win_5_list)
if res == True:
print("[ja_pytest]: False: judge win_line 3, but the win_line is 5")
# didn't pass test
return False
else:
pass
# check error_case win_line = 4
check_win_4_list = [obj_1,obj_2,obj_3,obj_4]
res = test_wild_v1(input_obj,check_win_4_list)
if res == True:
print("[ja_pytest]: False: judge win_line 3, but the win_line is 4")
# didn't pass test
return False
else:
pass
# check it's 3 line or not
check_win_3_list = [obj_1,obj_2,obj_3]
res = test_wild_v1(input_obj,check_win_3_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 3, but the win_line is not")
# didn't pass test
return False
elif input_win_line == 4:
# check error_case win_line=5
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_wild_v1(input_obj,check_win_5_list)
if res == True:
                        print("[ja_pytest]: False: judge win_line 4, but the win_line is 5")
# didn't pass test
return False
else:
pass
# check it's 4 line or not
check_win_4_list = [obj_1,obj_2,obj_3,obj_4]
res = test_wild_v1(input_obj,check_win_4_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 4, but the win_line is not")
# didn't pass test
return False
elif input_win_line == 5:
# check it's 5 line or not
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_wild_v1(input_obj,check_win_5_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 5, but the win_line is not")
# didn't pass test
return False
elif input_win_line == 2:
# check error_case win_line=5
check_win_5_list = [obj_1,obj_2,obj_3,obj_4,obj_5]
res = test_wild_v1(input_obj,check_win_5_list)
if res == True:
print("[ja_pytest]: False: judge win_line 2, but the win_line is 5")
# didn't pass test
return False
else:
pass
# check error_case win_line = 4
check_win_4_list = [obj_1,obj_2,obj_3,obj_4]
res = test_wild_v1(input_obj,check_win_4_list)
if res == True:
print("[ja_pytest]: False: judge win_line 2, but the win_line is 4")
# didn't pass test
return False
else:
pass
# check error_case win_line = 3
check_win_3_list = [obj_1,obj_2,obj_3]
res = test_wild_v1(input_obj,check_win_3_list)
if res == True:
print("[ja_pytest]: False: judge win_line 2, but the win_line is 3")
# didn't pass test
return False
else:
pass
# check it's 2 line or not
check_win_2_list = [obj_1,obj_2]
res = test_wild_v1(input_obj,check_win_2_list)
if res == True:
pass
else:
print("[ja_pytest]: False: judge win_line 2, but the win_line is not")
# didn't pass test
return False
else:
                print("[ja_pytest]: False: unsupported win_line (expected 2, 3, 4 or 5): ",input_win_line)
return False
            # All failure cases are handled above; the remaining case passes, so return True.
return True
else:
print("Error: The obj: %s is not in the line_obj_dict: \n %s \n"%(input_obj,self.line_obj_bet_rate))
return False
# # 🌟🌟🌟: add S1 Cases
def step_3_test_win_line_20_gain_v1(self,test_gain,input_obj,input_win_line,input_line_index):
if input_obj == "S1":
expected_gain = self.line_obj_bet_rate[input_obj][input_win_line]
# 🌟: S1 input_line_index = 1
if test_gain == expected_gain:
if self.line_obj_25_test_coverage[input_obj][input_win_line][input_line_index] == True:
pass
else:
self.line_obj_25_test_coverage[input_obj][input_win_line][input_line_index] = True
print("[ja_pytest][keyword_pytest_coverage]: current test_coverage updated: obj: %s win_line: %d ui_line_index %d "%(input_obj,input_win_line,input_line_index))
# 💡
# self.show_test_coverage()
return True
else:
return False
else:
# Jason: gain = bet_rate / 25
expected_gain = float(self.line_obj_bet_rate[input_obj][input_win_line] / 25)
print("test_gain = ",test_gain)
print("expec gain ",expected_gain)
if test_gain == expected_gain:
# Debug: print("[Debug]: the line_index = ",input_line_index)
# Debug: input()
# Debug: print("[ja_pytest]: pass case obj: %s , win_line: %2d , ui_line_index: %2d ,gain: %.4f"%(input_obj,input_win_line,input_line_index+1,expected_gain))
# Debug: print("[Debug]: input_obj ",input_obj)
# Debug: print("[Debug]: input_win_line ",input_win_line)
# Debug: print("[Debug]: input_line_index ",input_line_index)
# Set the test_coverage:
# 🌟: for non-S1 symbols, we add index +1 , for ui_purpose.
ui_line_index = input_line_index + 1
if self.line_obj_25_test_coverage[input_obj][input_win_line][ui_line_index] == True:
pass
else:
self.line_obj_25_test_coverage[input_obj][input_win_line][ui_line_index] = True
print("[ja_pytest][keyword_pytest_coverage]: current test_coverage updated: obj: %s win_line: %d ui_line_index %d "%(input_obj,input_win_line,ui_line_index))
# 💡
#self.show_test_coverage()
return True
else:
return False
    # 🌟🌟🌟: v2 version optimized the code to handle the different win_lines of (W1, H1, H2) versus the other symbols.
def show_test_coverage(self,key="slot_04"):
if key == "slot_04":
for any_obj in self.line_obj_25_test_coverage:
for any_win_line in self.line_obj_25_test_coverage[any_obj]:
print("\n---------Symbol: %s WinLine: %d------------------------"%(any_obj,any_win_line))
print(self.line_obj_25_test_coverage)
"""
    The previous v1 code was:
for any_obj in self.line_obj_25_test_coverage:
if any_obj == "W1" or any_obj == "H1" or any_obj == "H2":
print("\n---------------%s----------------"%any_obj)
print("----------- 3 ------------")
print(line_obj_25_test_coverage[any_obj][3])
print("----------- 4 ------------")
print(line_obj_25_test_coverage[any_obj][4])
print("----------- 5 ------------")
print(line_obj_25_test_coverage[any_obj][5])
print("-----------------------------------\n")
"""
    # 🌟🌟🌟: v2 version optimized the code to handle the different win_lines of (W1, H1, H2) versus the other symbols.
def show_test_cases_coverage(self):
# Recalculate again
self.passed_cases = 0
for any_obj in self.line_obj_25_test_coverage:
for any_win_line in self.line_obj_25_test_coverage[any_obj]:
for any_line_index in self.line_obj_25_test_coverage[any_obj][any_win_line]:
if self.line_obj_25_test_coverage[any_obj][any_win_line][any_line_index]:
self.passed_cases += 1
else:
pass
print("🧪: how_many_cases = ",self.how_many_cases)
print("🧪: how_many_cases_pass = ",self.passed_cases)
        print("🧪: Test coverage = %.4f"%float(self.passed_cases/self.how_many_cases))
"""
[{'win_obj': 'H2', 'win_line': 3, 'win_list_r': [['W1', 'L2', 'L1'], ['W1', 'L1', 'L4'], ['H2', 'H4', 'L6'], ['L6', 'L4', 'H2'], ['S1', 'L5', 'H4']], 'win_gain': 0.0, 'win_line_index': 1, 'win_bet_rate': 40.0}, {'win_obj': 'W1', 'win_line': 2, 'win_list_r': [['W1', 'L2', 'L1'], ['W1', 'L1', 'L4'], ['H2', 'H4', 'L6'], ['L6', 'L4', 'H2'], ['S1', 'L5', 'H4']], 'win_gain': 2.0, 'win_line_index': 5, 'win_bet_rate': 40.0}]
test_gain = 0.0
expec gain 2.0
bug H4-4: gain = 0.0 ?
[{'win_obj': 'H1', 'win_line': 4,
'win_list_r': [['S1', 'H3', 'W1'], ['H3', 'W1', 'L2'], ['H3', 'W1', 'L2'], ['L5', 'H1', 'L6'], ['L5', 'H4', 'L6']],
'win_gain': 0.0,
'win_line_index': 10,
'win_bet_rate': 500.0
}
    Root cause: the W1-3 pay score (500) was equal to the H1-4 pay score (500), making wild_gain == obj_gain, so the code never assigned current_gain in this case.
    Fixed by changing the W1-3 pay score from 500 to 600.
"""
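The pay-score collision described in the note above is easy to reproduce; a hypothetical sketch of the buggy selection logic (`pick_gain` and the pay values are illustrative, not the game's actual code):

```python
PAY = {("W1", 3): 500.0, ("H1", 4): 500.0}  # equal scores, as in the bug report

def pick_gain(wild_gain, obj_gain):
    # Buggy selection: when the two gains tie, neither branch fires
    # and the gain is left at 0.0.
    current_gain = 0.0
    if wild_gain > obj_gain:
        current_gain = wild_gain
    elif obj_gain > wild_gain:
        current_gain = obj_gain
    return current_gain

assert pick_gain(PAY[("W1", 3)], PAY[("H1", 4)]) == 0.0  # the reported bug
assert pick_gain(600.0, 500.0) == 600.0  # resolved once W1-3 pays 600
```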
if __name__ == "__main__":
"""
win_obj = "H1"
win_line = 3
print("win_obj = ",win_obj)
print("win_line = ",win_line)
print("win_bet = ",line_obj_bet_rate[win_obj][win_line])
for any_obj in line_obj_25_test_coverage:
print("\n---------------%s----------------"%any_obj)
print(line_obj_25_test_coverage[any_obj])
print("-----------------------------------\n")
"""
test_agent = agent()
test_agent.show_test_cases_coverage()
# --- nest_py/tests/unit/db/test_db_ops_utils.py (repo: KnowEnG/platform, license: MIT) ---
import pytest
import nest_py.core.db.db_ops_utils as db_ops_utils
def test_validate_configs():
    """Tests _validate_user_configs."""
# no app
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(None)
# no DEMO_AUTHENTICATION_ACCOUNTS key in config
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs([])
# duplicate usernames
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(
[
{
'username': 'demouser',
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User',
}, {
'username': 'demouser',
'password': 'otherpass',
'given_name': 'Other',
'family_name': 'User',
}
])
# no username
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(
[{
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User'}])
# no password
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'given_name': 'Demo',
'family_name': 'User'}])
# no given_name
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'family_name': 'User'}])
# no family_name
with pytest.raises(AttributeError):
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'given_name': 'Demo'}])
# empty username
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': '',
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == 'username cannot be empty'
# empty password
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': '',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == 'password cannot be empty'
# empty given_name
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'given_name': '',
'family_name': 'User'}])
assert str(exc.value) == 'given_name cannot be empty'
# empty family_name
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'given_name': 'Demo',
'family_name': ''}])
assert str(exc.value) == 'family_name cannot be empty'
# spaces-only username
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': ' ',
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == 'username cannot be empty'
# spaces-only password
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': ' ',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == 'password cannot be empty'
# space-only given_name
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'given_name': ' ',
'family_name': 'User'}])
assert str(exc.value) == 'given_name cannot be empty'
# spaces-only family_name
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass',
'given_name': 'Demo',
'family_name': ' '}])
assert str(exc.value) == 'family_name cannot be empty'
# username w/ spaces
with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser ',
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == \
'DEMO_AUTHENTICATION_USERNAME cannot contain spaces'
# password w/ spaces
    with pytest.raises(AttributeError) as exc:
db_ops_utils._validate_user_configs(
[{
'username': 'demouser',
'password': 'demopass ',
'given_name': 'Demo',
'family_name': 'User'}])
assert str(exc.value) == \
'DEMO_AUTHENTICATION_PASSWORD cannot contain spaces'
# good username and password
db_ops_utils._validate_user_configs(
[ {
'username': 'demouser',
'userid': 11,
'password': 'demopass',
'given_name': 'Demo',
'family_name': 'User',
}, {
'username': 'otheruser',
'userid': 12,
'password': 'otherpass',
'given_name': 'Other',
'family_name': 'User',
}
])
return
# --- tests/test_factories/test_algorithms.py (repo: j2slab/MLStudio, license: BSD-3-Clause) ---
# -*- coding:utf-8 -*-
# =========================================================================== #
# Project : MLStudio #
# File : \test_algorithms.py #
# Python : 3.8.3 #
# --------------------------------------------------------------------------- #
# Author : John James #
# Company : nov8.ai #
# Email : jjames@nov8.ai #
# URL : https://github.com/nov8ai/MLStudio #
# --------------------------------------------------------------------------- #
# Created : Friday, August 7th 2020, 4:57:39 am #
# Last Modified : Friday, August 7th 2020, 4:57:39 am #
# Modified By : John James (jjames@nov8.ai) #
# --------------------------------------------------------------------------- #
# License : BSD #
# Copyright (c) 2020 nov8.ai #
# =========================================================================== #
import numpy as np
import pytest
from pytest import mark
from mlstudio.factories.algorithms import GradientDescent
from mlstudio.supervised.algorithms.optimization.observers import early_stop
from mlstudio.supervised.algorithms.optimization.observers import learning_rate
from mlstudio.supervised.algorithms.optimization.services import loss, regularizers
from mlstudio.supervised.algorithms.optimization.services import activations
# --------------------------------------------------------------------------- #
@mark.factories
class GDFactoryTests:
def test_regression_factories(self):
# Test #1: Immutable parameters:
estimator = GradientDescent().regressor(eta0=0.05, epochs=2000, batch_size=32,
val_size=0.4, theta_init=5, verbose=True,
random_state=5, check_gradient=True)
assert estimator.eta0 == 0.05, "eta0 error"
assert estimator.epochs == 2000, "epochs error"
assert estimator.batch_size == 32, "batch_size error"
assert estimator.val_size == 0.4, "val_size error"
assert estimator.theta_init == 5, "theta_init error"
assert estimator.verbose == True, "verbose error"
assert estimator.random_state == 5, "random_state error"
assert estimator.check_gradient == True, "check_gradient error"
# Test #2: Loss
estimator = GradientDescent().regressor(loss=loss.Quadratic(regularizer=regularizers.L2()))
assert "L2" in estimator.loss.regularizer.name, "Loss initialization error"
# Test #3: Data Processor
assert "Regression" in estimator.data_processor.__class__.__name__, "Data processor error"
# Test #4: Optimizer
assert "Gradient" in estimator.optimizer.name, "Optimizer error"
# Test #5: Scorer
assert "R2" in estimator.scorer.name, "Scorer error"
# Test #6: Early stop
estimator = GradientDescent().regressor(early_stop=early_stop.EarlyStop())
assert "Early" in estimator.early_stop.name, "Early stop error"
# Test #7: Learning rate
estimator = GradientDescent().regressor(learning_rate=learning_rate.TimeDecay())
assert "Time" in estimator.learning_rate.name, "Learning rate error"
def test_binaryclass_factories(self):
# Test #1: Immutable parameters:
estimator = GradientDescent().binaryclass(eta0=0.05, epochs=2000, batch_size=32,
val_size=0.4, theta_init=5, verbose=True,
random_state=5, check_gradient=True)
assert estimator.eta0 == 0.05, "eta0 error"
assert estimator.epochs == 2000, "epochs error"
assert estimator.batch_size == 32, "batch_size error"
assert estimator.val_size == 0.4, "val_size error"
assert estimator.theta_init == 5, "theta_init error"
assert estimator.verbose == True, "verbose error"
assert estimator.random_state == 5, "random_state error"
assert estimator.check_gradient == True, "check_gradient error"
# Test #2: Loss
estimator = GradientDescent().binaryclass(loss=loss.CrossEntropy(regularizer=regularizers.L2()))
assert "L2" in estimator.loss.regularizer.name, "Loss initialization error"
# Test #3: Data Processor
assert "Binary" in estimator.data_processor.__class__.__name__, "Data processor error"
# Test #4: Optimizer
assert "Gradient" in estimator.optimizer.name, "Optimizer error"
# Test #5: Scorer
assert "accuracy" in estimator.scorer.name, "Scorer error"
# Test #6: Early stop
estimator = GradientDescent().binaryclass(early_stop=early_stop.EarlyStop())
assert "Early" in estimator.early_stop.name, "Early stop error"
# Test #7: Learning rate
estimator = GradientDescent().binaryclass(learning_rate=learning_rate.TimeDecay())
assert "Time" in estimator.learning_rate.name, "Learning rate error"
def test_multiclass_factories(self):
# Test #1: Immutable parameters:
estimator = GradientDescent().multiclass(eta0=0.05, epochs=2000, batch_size=32,
val_size=0.4, theta_init=5, verbose=True,
random_state=5, check_gradient=True)
assert estimator.eta0 == 0.05, "eta0 error"
assert estimator.epochs == 2000, "epochs error"
assert estimator.batch_size == 32, "batch_size error"
assert estimator.val_size == 0.4, "val_size error"
assert estimator.theta_init == 5, "theta_init error"
assert estimator.verbose == True, "verbose error"
assert estimator.random_state == 5, "random_state error"
assert estimator.check_gradient == True, "check_gradient error"
# Test #2: Loss
estimator = GradientDescent().multiclass(loss=loss.CategoricalCrossEntropy(regularizer=regularizers.L2()))
assert "L2" in estimator.loss.regularizer.name, "Loss initialization error"
# Test #3: Data Processor
assert "Multi" in estimator.data_processor.__class__.__name__, "Data processor error"
# Test #4: Optimizer
assert "Gradient" in estimator.optimizer.name, "Optimizer error"
# Test #5: Scorer
assert "accuracy" in estimator.scorer.name, "Scorer error"
# Test #6: Early stop
estimator = GradientDescent().multiclass(early_stop=early_stop.EarlyStop())
assert "Early" in estimator.early_stop.name, "Early stop error"
# Test #7: Learning rate
estimator = GradientDescent().multiclass(learning_rate=learning_rate.TimeDecay())
assert "Time" in estimator.learning_rate.name, "Learning rate error" | 68.042017 | 114 | 0.505002 | 711 | 8,097 | 5.611814 | 0.160338 | 0.090226 | 0.105263 | 0.013534 | 0.8401 | 0.8401 | 0.8401 | 0.781454 | 0.722556 | 0.722556 | 0 | 0.02861 | 0.361121 | 8,097 | 119 | 115 | 68.042017 | 0.742702 | 0.234778 | 0 | 0.60274 | 0 | 0 | 0.127576 | 0 | 0 | 0 | 0 | 0 | 0.575342 | 1 | 0.041096 | false | 0 | 0.109589 | 0 | 0.164384 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
17f702499b92c956471715ec020c5fdf28e58822 | 8,085 | py | Python | Apps/FileExplorer/app.py | miyucode/MaxPyOS | d329b3d6be01006c92ac84fc44823139eb2daa39 | [
"MIT"
] | 2 | 2022-03-06T17:17:36.000Z | 2022-03-08T11:33:03.000Z | Apps/FileExplorer/app.py | miyucode/MaxPyOS | d329b3d6be01006c92ac84fc44823139eb2daa39 | [
"MIT"
] | null | null | null | Apps/FileExplorer/app.py | miyucode/MaxPyOS | d329b3d6be01006c92ac84fc44823139eb2daa39 | [
"MIT"
] | null | null | null | from UI.Menu import menu
from tkinter import *
from tkinter.scrolledtext import ScrolledText
from Apps.Notepad.app import notepad, openFile
import tkinter.filedialog as fd
import tkinter.messagebox as mb
import os
import shutil
def fileexplorer():
app = Tk()
app.title("MaxPyOS - File Explorer")
app.geometry("800x500")
app.resizable(False, False)
app.iconbitmap("Apps/FileExplorer/icons/fileexplorer-icon.ico")
def delete_folder():
dir_to_delete = fd.askdirectory(title="Which folder to delete?", initialdir="System/Users/User/Desktop")
        if dir_to_delete == "":
            mb.showerror(title="Error!", message="We're not able to delete this folder!")
            updatelist()
        else:
            os.rmdir(dir_to_delete)
            mb.showinfo('MaxPyOS - File Explorer', 'Folder has been deleted successfully.')
            updatelist()
    def updatelist():
        # Rebuild the listbox from the current contents of the desktop folder.
        i = 0
files = os.listdir("System/Users/User/Desktop/")
listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
scrollbar.pack(side=RIGHT, fill=Y)
listbox.config(yscrollcommand=scrollbar.set)
while i < len(files):
listbox.insert(END, files[i])
i += 1
def copy_file():
file_to_copy = fd.askopenfilename(title='Choose a file to copy', filetypes=[("All files", "*.*")], initialdir="System/Users/User/Desktop")
dir_to_copy_to = fd.askdirectory(title="In which folder to copy to?", initialdir="System/Users/User/Desktop")
        if file_to_copy == "":
            mb.showerror(title="Error!", message="We're not able to copy this file!")
            updatelist()
        elif dir_to_copy_to == "":
            mb.showerror(title="Error!", message="We're not able to copy this file!")
            updatelist()
        else:
            try:
                shutil.copy(file_to_copy, dir_to_copy_to)
                mb.showinfo(title='File copied!', message='Your desired file has been copied to your desired location')
            except Exception:
                mb.showerror(title='Error!', message='We were unable to copy your file to the desired location. Please try again')
            updatelist()


def delete_file():
    file = fd.askopenfilename(title='Choose a file to delete', filetypes=[("All files", "*.*")], initialdir="System/Users/User/Desktop")
    if file == "":
        mb.showerror(title="Error!", message='We\'re not able to delete this file!')
        i = 0
        files = os.listdir("System/Users/User/Desktop/")
        listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
        listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
        scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
        scrollbar.pack(side=RIGHT, fill=Y)
        listbox.config(yscrollcommand=scrollbar.set)
        while i < len(files):
            listbox.insert(END, files[i])
            i += 1
    else:
        os.remove(os.path.abspath(file))
        mb.showinfo('MaxPyOS - File Explorer', message='File has been deleted successfully.')
        i = 0
        files = os.listdir("System/Users/User/Desktop/")
        listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
        listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
        scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
        scrollbar.pack(side=RIGHT, fill=Y)
        listbox.config(yscrollcommand=scrollbar.set)
        while i < len(files):
            listbox.insert(END, files[i])
            i += 1


def open_file():
    file = fd.askopenfilename(title='Choose a file of any type', filetypes=[("All files", "*.*")], initialdir="System/Users/User/Desktop")
    if file == "":
        mb.showerror(title="Error!", message='We\'re not able to open this file!')
        i = 0
        files = os.listdir("System/Users/User/Desktop/")
        listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
        listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
        scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
        scrollbar.pack(side=RIGHT, fill=Y)
        listbox.config(yscrollcommand=scrollbar.set)
        while i < len(files):
            listbox.insert(END, files[i])
            i += 1
    else:
        i = 0
        files = os.listdir("System/Users/User/Desktop/")
        listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
        listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
        scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
        scrollbar.pack(side=RIGHT, fill=Y)
        listbox.config(yscrollcommand=scrollbar.set)
        while i < len(files):
            listbox.insert(END, files[i])
            i += 1
        split_file = os.path.splitext(file)
        file_extension = split_file[1]
        if file_extension == ".txt":
            openFile(file)
        else:
            os.startfile(os.path.abspath(file))


def list_files_in_folder():
    i = 0
    files = os.listdir("System/Users/User/Desktop/")
    listbox = Listbox(app, selectbackground='SteelBlue', font=("Arial", 10))
    listbox.place(relx=0, rely=0, relheight=1, relwidth=1)
    scrollbar = Scrollbar(listbox, orient=VERTICAL, command=listbox.yview)
    scrollbar.pack(side=RIGHT, fill=Y)
    listbox.config(yscrollcommand=scrollbar.set)
    while i < len(files):
        listbox.insert(END, files[i])
        i += 1


list_files_in_folder()
menubar = Menu(app)
file = Menu(menubar, tearoff=0)
file.add_command(label="Open a file", command=open_file)
file.add_command(label="Delete a file", command=delete_file)
file.add_command(label="Copy a file", command=copy_file)
file.add_separator()
file.add_command(label="Delete a folder", command=delete_folder)
file.add_separator()
file.add_command(label="Update", command=updatelist)
menubar.add_cascade(label="File", menu=file)
app.config(menu=menubar)
app.mainloop()
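The explorer above rebuilds the same Listbox block after every action. A minimal sketch of factoring the repeated directory-listing step into one reusable helper — the name `list_desktop_files` and the empty-list fallback are assumptions, not part of the original:

```python
import os


def list_desktop_files(path="System/Users/User/Desktop/"):
    """Return the sorted directory entries the explorer would display.

    Hypothetical helper: mirrors the repeated os.listdir + insert loop,
    returning [] when the folder is missing instead of raising.
    """
    try:
        return sorted(os.listdir(path))
    except FileNotFoundError:
        return []
```

Each menu callback could then refresh the widget with a simple `for name in list_desktop_files(): listbox.insert(END, name)` instead of duplicating the whole setup block.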

# tests/test_code/py/subset_find_exception/two.py (FreddyZeng/code2flow, MIT)
def private():
    pass


class Abra:
    def func():
        private()


class Cadabra:
    def func():
        private()

# enjoy/005_20210731/e02.py (fkubota/enjoy-vim, MIT)
'''
'b': (3, 4) to 'c': 5
'''
func('hoge', **{'b': (3, 4)})

# dingtalk/python/alibabacloud_dingtalk/calendar_1_0/models.py (aliyun/dingtalk-sdk, Apache-2.0)
# -*- coding: utf-8 -*-
# This file is auto-generated, don't edit it. Thanks.
from Tea.model import TeaModel
from typing import Dict, List, Any


class CreateAclsHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class CreateAclsRequestScope(TeaModel):
    def __init__(
        self,
        scope_type: str = None,
        user_id: str = None,
    ):
        # Permission type
        self.scope_type = scope_type
        # User ID
        self.user_id = user_id

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.scope_type is not None:
            result['scopeType'] = self.scope_type
        if self.user_id is not None:
            result['userId'] = self.user_id
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('scopeType') is not None:
            self.scope_type = m.get('scopeType')
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        return self


class CreateAclsRequest(TeaModel):
    def __init__(
        self,
        privilege: str = None,
        send_msg: bool = None,
        scope: CreateAclsRequestScope = None,
    ):
        # Access permission on the calendar
        self.privilege = privilege
        # Whether to send a message to the grantee
        self.send_msg = send_msg
        # Permission scope
        self.scope = scope

    def validate(self):
        if self.scope:
            self.scope.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.privilege is not None:
            result['privilege'] = self.privilege
        if self.send_msg is not None:
            result['sendMsg'] = self.send_msg
        if self.scope is not None:
            result['scope'] = self.scope.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('privilege') is not None:
            self.privilege = m.get('privilege')
        if m.get('sendMsg') is not None:
            self.send_msg = m.get('sendMsg')
        if m.get('scope') is not None:
            temp_model = CreateAclsRequestScope()
            self.scope = temp_model.from_map(m['scope'])
        return self
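Every model in this generated file follows the same `to_map`/`from_map` convention: only fields that are not `None` are serialized, and nested models are converted recursively. A standalone sketch of that round-trip pattern, using a plain stand-in class with no Tea dependency (the class name `MiniScope` is illustrative, not part of the SDK):

```python
class MiniScope:
    """Minimal stand-in for a Tea model's serialization convention."""

    def __init__(self, scope_type: str = None, user_id: str = None):
        self.scope_type = scope_type
        self.user_id = user_id

    def to_map(self) -> dict:
        # Only emit keys whose values are set, matching the generated code.
        result = dict()
        if self.scope_type is not None:
            result['scopeType'] = self.scope_type
        if self.user_id is not None:
            result['userId'] = self.user_id
        return result

    def from_map(self, m: dict = None):
        # Only overwrite fields present in the map, then return self
        # so calls can be chained: MiniScope().from_map(payload).
        m = m or dict()
        if m.get('scopeType') is not None:
            self.scope_type = m.get('scopeType')
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        return self
```

The asymmetric key names (`scopeType` in the wire format, `scope_type` on the object) come from the generator mapping camelCase JSON fields to snake_case Python attributes.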


class CreateAclsResponseBodyScope(TeaModel):
    def __init__(
        self,
        scope_type: str = None,
        user_id: str = None,
    ):
        # Permission type
        self.scope_type = scope_type
        # User ID
        self.user_id = user_id

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.scope_type is not None:
            result['scopeType'] = self.scope_type
        if self.user_id is not None:
            result['userId'] = self.user_id
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('scopeType') is not None:
            self.scope_type = m.get('scopeType')
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        return self


class CreateAclsResponseBody(TeaModel):
    def __init__(
        self,
        privilege: str = None,
        acl_id: str = None,
        scope: CreateAclsResponseBodyScope = None,
    ):
        # Access permission on the calendar
        self.privilege = privilege
        # ACL resource ID
        self.acl_id = acl_id
        # Permission scope
        self.scope = scope

    def validate(self):
        if self.scope:
            self.scope.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.privilege is not None:
            result['privilege'] = self.privilege
        if self.acl_id is not None:
            result['aclId'] = self.acl_id
        if self.scope is not None:
            result['scope'] = self.scope.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('privilege') is not None:
            self.privilege = m.get('privilege')
        if m.get('aclId') is not None:
            self.acl_id = m.get('aclId')
        if m.get('scope') is not None:
            temp_model = CreateAclsResponseBodyScope()
            self.scope = temp_model.from_map(m['scope'])
        return self


class CreateAclsResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: CreateAclsResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = CreateAclsResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class ListAclsHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class ListAclsResponseBodyAclsScope(TeaModel):
    def __init__(
        self,
        user_id: str = None,
        scope_type: str = None,
    ):
        # User ID
        self.user_id = user_id
        # Permission type
        self.scope_type = scope_type

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.user_id is not None:
            result['userId'] = self.user_id
        if self.scope_type is not None:
            result['scopeType'] = self.scope_type
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        if m.get('scopeType') is not None:
            self.scope_type = m.get('scopeType')
        return self


class ListAclsResponseBodyAcls(TeaModel):
    def __init__(
        self,
        privilege: str = None,
        acl_id: str = None,
        scope: ListAclsResponseBodyAclsScope = None,
    ):
        # Permission info
        self.privilege = privilege
        # ACL resource ID
        self.acl_id = acl_id
        # Permission scope
        self.scope = scope

    def validate(self):
        if self.scope:
            self.scope.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.privilege is not None:
            result['privilege'] = self.privilege
        if self.acl_id is not None:
            result['aclId'] = self.acl_id
        if self.scope is not None:
            result['scope'] = self.scope.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('privilege') is not None:
            self.privilege = m.get('privilege')
        if m.get('aclId') is not None:
            self.acl_id = m.get('aclId')
        if m.get('scope') is not None:
            temp_model = ListAclsResponseBodyAclsScope()
            self.scope = temp_model.from_map(m['scope'])
        return self


class ListAclsResponseBody(TeaModel):
    def __init__(
        self,
        acls: List[ListAclsResponseBodyAcls] = None,
    ):
        # Access control list
        self.acls = acls

    def validate(self):
        if self.acls:
            for k in self.acls:
                if k:
                    k.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        result['acls'] = []
        if self.acls is not None:
            for k in self.acls:
                result['acls'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        self.acls = []
        if m.get('acls') is not None:
            for k in m.get('acls'):
                temp_model = ListAclsResponseBodyAcls()
                self.acls.append(temp_model.from_map(k))
        return self


class ListAclsResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: ListAclsResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = ListAclsResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class RespondEventHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class RespondEventRequest(TeaModel):
    def __init__(
        self,
        response_status: str = None,
    ):
        self.response_status = response_status

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.response_status is not None:
            result['responseStatus'] = self.response_status
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('responseStatus') is not None:
            self.response_status = m.get('responseStatus')
        return self


class RespondEventResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
    ):
        self.headers = headers

    def validate(self):
        self.validate_required(self.headers, 'headers')

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        return self


class GenerateCaldavAccountHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        ding_uid: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        # User ID authorizing this call; when this field is set, the call is considered authorized to access all data this user can access
        self.ding_uid = ding_uid
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.ding_uid is not None:
            result['dingUid'] = self.ding_uid
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('dingUid') is not None:
            self.ding_uid = m.get('dingUid')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GenerateCaldavAccountRequest(TeaModel):
    def __init__(
        self,
        device: str = None,
    ):
        # Device name
        self.device = device

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.device is not None:
            result['device'] = self.device
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('device') is not None:
            self.device = m.get('device')
        return self


class GenerateCaldavAccountResponseBody(TeaModel):
    def __init__(
        self,
        server_address: str = None,
        username: str = None,
        password: str = None,
    ):
        self.server_address = server_address
        self.username = username
        self.password = password

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.server_address is not None:
            result['serverAddress'] = self.server_address
        if self.username is not None:
            result['username'] = self.username
        if self.password is not None:
            result['password'] = self.password
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('serverAddress') is not None:
            self.server_address = m.get('serverAddress')
        if m.get('username') is not None:
            self.username = m.get('username')
        if m.get('password') is not None:
            self.password = m.get('password')
        return self


class GenerateCaldavAccountResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GenerateCaldavAccountResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GenerateCaldavAccountResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self


class GetScheduleHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class GetScheduleRequest(TeaModel):
    def __init__(
        self,
        user_ids: List[str] = None,
        start_time: str = None,
        end_time: str = None,
    ):
        # List of users to query
        self.user_ids = user_ids
        # Query start time
        self.start_time = start_time
        # Query end time
        self.end_time = end_time

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.user_ids is not None:
            result['userIds'] = self.user_ids
        if self.start_time is not None:
            result['startTime'] = self.start_time
        if self.end_time is not None:
            result['endTime'] = self.end_time
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('userIds') is not None:
            self.user_ids = m.get('userIds')
        if m.get('startTime') is not None:
            self.start_time = m.get('startTime')
        if m.get('endTime') is not None:
            self.end_time = m.get('endTime')
        return self


class GetScheduleResponseBodyScheduleInformationScheduleItemsStart(TeaModel):
    def __init__(
        self,
        date: str = None,
        date_time: str = None,
        time_zone: str = None,
    ):
        # Start date
        self.date = date
        # Start timestamp, in ISO 8601 format
        self.date_time = date_time
        # Time zone
        self.time_zone = time_zone

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.date is not None:
            result['date'] = self.date
        if self.date_time is not None:
            result['dateTime'] = self.date_time
        if self.time_zone is not None:
            result['timeZone'] = self.time_zone
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('date') is not None:
            self.date = m.get('date')
        if m.get('dateTime') is not None:
            self.date_time = m.get('dateTime')
        if m.get('timeZone') is not None:
            self.time_zone = m.get('timeZone')
        return self


class GetScheduleResponseBodyScheduleInformationScheduleItemsEnd(TeaModel):
    def __init__(
        self,
        date: str = None,
        date_time: str = None,
        time_zone: str = None,
    ):
        # End date
        self.date = date
        # End timestamp, in ISO 8601 format
        self.date_time = date_time
        # Time zone of the timestamp
        self.time_zone = time_zone

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.date is not None:
            result['date'] = self.date
        if self.date_time is not None:
            result['dateTime'] = self.date_time
        if self.time_zone is not None:
            result['timeZone'] = self.time_zone
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('date') is not None:
            self.date = m.get('date')
        if m.get('dateTime') is not None:
            self.date_time = m.get('dateTime')
        if m.get('timeZone') is not None:
            self.time_zone = m.get('timeZone')
        return self


class GetScheduleResponseBodyScheduleInformationScheduleItems(TeaModel):
    def __init__(
        self,
        status: str = None,
        start: GetScheduleResponseBodyScheduleInformationScheduleItemsStart = None,
        end: GetScheduleResponseBodyScheduleInformationScheduleItemsEnd = None,
    ):
        # Status: - BUSY, - TENTATIVE (tentatively busy)
        self.status = status
        # Start time: a date, or a timestamp with a time zone
        self.start = start
        # End time: a date, or a timestamp with a time zone
        self.end = end

    def validate(self):
        if self.start:
            self.start.validate()
        if self.end:
            self.end.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.status is not None:
            result['status'] = self.status
        if self.start is not None:
            result['start'] = self.start.to_map()
        if self.end is not None:
            result['end'] = self.end.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('status') is not None:
            self.status = m.get('status')
        if m.get('start') is not None:
            temp_model = GetScheduleResponseBodyScheduleInformationScheduleItemsStart()
            self.start = temp_model.from_map(m['start'])
        if m.get('end') is not None:
            temp_model = GetScheduleResponseBodyScheduleInformationScheduleItemsEnd()
            self.end = temp_model.from_map(m['end'])
        return self


class GetScheduleResponseBodyScheduleInformation(TeaModel):
    def __init__(
        self,
        user_id: str = None,
        error: str = None,
        schedule_items: List[GetScheduleResponseBodyScheduleInformationScheduleItems] = None,
    ):
        # User userId
        self.user_id = user_id
        # Error description
        self.error = error
        self.schedule_items = schedule_items

    def validate(self):
        if self.schedule_items:
            for k in self.schedule_items:
                if k:
                    k.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.user_id is not None:
            result['userId'] = self.user_id
        if self.error is not None:
            result['error'] = self.error
        result['scheduleItems'] = []
        if self.schedule_items is not None:
            for k in self.schedule_items:
                result['scheduleItems'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('userId') is not None:
            self.user_id = m.get('userId')
        if m.get('error') is not None:
            self.error = m.get('error')
        self.schedule_items = []
        if m.get('scheduleItems') is not None:
            for k in m.get('scheduleItems'):
                temp_model = GetScheduleResponseBodyScheduleInformationScheduleItems()
                self.schedule_items.append(temp_model.from_map(k))
        return self


class GetScheduleResponseBody(TeaModel):
    def __init__(
        self,
        schedule_information: List[GetScheduleResponseBodyScheduleInformation] = None,
    ):
        # Free/busy information
        self.schedule_information = schedule_information

    def validate(self):
        if self.schedule_information:
            for k in self.schedule_information:
                if k:
                    k.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        result['scheduleInformation'] = []
        if self.schedule_information is not None:
            for k in self.schedule_information:
                result['scheduleInformation'].append(k.to_map() if k else None)
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        self.schedule_information = []
        if m.get('scheduleInformation') is not None:
            for k in m.get('scheduleInformation'):
                temp_model = GetScheduleResponseBodyScheduleInformation()
                self.schedule_information.append(temp_model.from_map(k))
        return self


class GetScheduleResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: GetScheduleResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = GetScheduleResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
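The schedule items above describe busy intervals whose `dateTime` fields are ISO 8601 timestamps. A small sketch of checking whether two such intervals collide, e.g. when searching these results for a free slot (the helper name is an assumption; `datetime.fromisoformat` accepts offsets like `+08:00`):

```python
from datetime import datetime


def intervals_overlap(start_a: str, end_a: str, start_b: str, end_b: str) -> bool:
    """Return True if the half-open busy intervals [start, end) intersect.

    Hypothetical helper operating on ISO 8601 strings, like the
    dateTime values carried by the schedule items above.
    """
    a0, a1 = datetime.fromisoformat(start_a), datetime.fromisoformat(end_a)
    b0, b1 = datetime.fromisoformat(start_b), datetime.fromisoformat(end_b)
    # Two intervals overlap iff each starts before the other ends.
    return a0 < b1 and b0 < a1
```

Intervals that merely touch (one ends exactly when the other starts) are treated as non-overlapping, which is the usual convention for back-to-back meetings.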


class ConvertLegacyEventIdHeaders(TeaModel):
    def __init__(
        self,
        common_headers: Dict[str, str] = None,
        ding_org_id: str = None,
        ding_uid: str = None,
        ding_access_token_type: str = None,
        x_acs_dingtalk_access_token: str = None,
    ):
        self.common_headers = common_headers
        # Corp ID authorizing this call; when this field is set, the call is considered authorized to access all data under this corp
        self.ding_org_id = ding_org_id
        # User ID authorizing this call; when this field is set, the call is considered authorized to access all data this user can access
        self.ding_uid = ding_uid
        # Authorization type
        self.ding_access_token_type = ding_access_token_type
        self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.common_headers is not None:
            result['commonHeaders'] = self.common_headers
        if self.ding_org_id is not None:
            result['dingOrgId'] = self.ding_org_id
        if self.ding_uid is not None:
            result['dingUid'] = self.ding_uid
        if self.ding_access_token_type is not None:
            result['dingAccessTokenType'] = self.ding_access_token_type
        if self.x_acs_dingtalk_access_token is not None:
            result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('commonHeaders') is not None:
            self.common_headers = m.get('commonHeaders')
        if m.get('dingOrgId') is not None:
            self.ding_org_id = m.get('dingOrgId')
        if m.get('dingUid') is not None:
            self.ding_uid = m.get('dingUid')
        if m.get('dingAccessTokenType') is not None:
            self.ding_access_token_type = m.get('dingAccessTokenType')
        if m.get('x-acs-dingtalk-access-token') is not None:
            self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
        return self


class ConvertLegacyEventIdRequest(TeaModel):
    def __init__(
        self,
        legacy_event_ids: Dict[str, str] = None,
    ):
        self.legacy_event_ids = legacy_event_ids

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.legacy_event_ids is not None:
            result['legacyEventIds'] = self.legacy_event_ids
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('legacyEventIds') is not None:
            self.legacy_event_ids = m.get('legacyEventIds')
        return self


class ConvertLegacyEventIdResponseBody(TeaModel):
    def __init__(
        self,
        legacy_event_id_map: Dict[str, Any] = None,
    ):
        # legacyEventIdMap
        self.legacy_event_id_map = legacy_event_id_map

    def validate(self):
        pass

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.legacy_event_id_map is not None:
            result['legacyEventIdMap'] = self.legacy_event_id_map
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('legacyEventIdMap') is not None:
            self.legacy_event_id_map = m.get('legacyEventIdMap')
        return self


class ConvertLegacyEventIdResponse(TeaModel):
    def __init__(
        self,
        headers: Dict[str, str] = None,
        body: ConvertLegacyEventIdResponseBody = None,
    ):
        self.headers = headers
        self.body = body

    def validate(self):
        self.validate_required(self.headers, 'headers')
        self.validate_required(self.body, 'body')
        if self.body:
            self.body.validate()

    def to_map(self):
        _map = super().to_map()
        if _map is not None:
            return _map
        result = dict()
        if self.headers is not None:
            result['headers'] = self.headers
        if self.body is not None:
            result['body'] = self.body.to_map()
        return result

    def from_map(self, m: dict = None):
        m = m or dict()
        if m.get('headers') is not None:
            self.headers = m.get('headers')
        if m.get('body') is not None:
            temp_model = ConvertLegacyEventIdResponseBody()
            self.body = temp_model.from_map(m['body'])
        return self
class RemoveAttendeeHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class RemoveAttendeeRequestAttendeesToRemove(TeaModel):
def __init__(
self,
id: str = None,
):
self.id = id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
return self
class RemoveAttendeeRequest(TeaModel):
def __init__(
self,
attendees_to_remove: List[RemoveAttendeeRequestAttendeesToRemove] = None,
):
self.attendees_to_remove = attendees_to_remove
def validate(self):
if self.attendees_to_remove:
for k in self.attendees_to_remove:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['attendeesToRemove'] = []
if self.attendees_to_remove is not None:
for k in self.attendees_to_remove:
result['attendeesToRemove'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.attendees_to_remove = []
if m.get('attendeesToRemove') is not None:
for k in m.get('attendeesToRemove'):
temp_model = RemoveAttendeeRequestAttendeesToRemove()
self.attendees_to_remove.append(temp_model.from_map(k))
return self
class RemoveAttendeeResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
):
self.headers = headers
def validate(self):
self.validate_required(self.headers, 'headers')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
return self
class AddAttendeeHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class AddAttendeeRequestAttendeesToAdd(TeaModel):
def __init__(
self,
id: str = None,
):
self.id = id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
return self
class AddAttendeeRequest(TeaModel):
def __init__(
self,
attendees_to_add: List[AddAttendeeRequestAttendeesToAdd] = None,
):
self.attendees_to_add = attendees_to_add
def validate(self):
if self.attendees_to_add:
for k in self.attendees_to_add:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['attendeesToAdd'] = []
if self.attendees_to_add is not None:
for k in self.attendees_to_add:
result['attendeesToAdd'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.attendees_to_add = []
if m.get('attendeesToAdd') is not None:
for k in m.get('attendeesToAdd'):
temp_model = AddAttendeeRequestAttendeesToAdd()
self.attendees_to_add.append(temp_model.from_map(k))
return self
class AddAttendeeResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
):
self.headers = headers
def validate(self):
self.validate_required(self.headers, 'headers')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
return self
class CreateEventHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CreateEventRequestStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
        # Start date of the event. Required for all-day events and must be empty otherwise. Format: yyyy-MM-dd.
self.date = date
        # Start time of the event. Required for non-all-day events and must be empty for all-day events. ISO-8601 date-time format.
self.date_time = date_time
        # Time zone of the event start time. Required for non-all-day events and must be empty for all-day events. tz database name; see https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class CreateEventRequestEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
        # End date of the event. Required for all-day events and must be empty otherwise. Format: yyyy-MM-dd.
self.date = date
        # End time of the event. Required for non-all-day events and must be empty for all-day events. ISO-8601 date-time format.
self.date_time = date_time
        # Time zone of the event end time. Required for non-all-day events and must be empty for all-day events. tz database name; see https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class CreateEventRequestRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
        # Recurrence rule type:
        #   daily: every `interval` days
        #   weekly: every `interval` weeks on `daysOfWeek`
        #   absoluteMonthly: every `interval` months on day `dayOfMonth`
        #   relativeMonthly: every `interval` months, in week `index`, on `daysOfWeek`
        #   absoluteYearly: every `interval` years
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class CreateEventRequestRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class CreateEventRequestRecurrence(TeaModel):
def __init__(
self,
pattern: CreateEventRequestRecurrencePattern = None,
range: CreateEventRequestRecurrenceRange = None,
):
        # Recurrence rule
self.pattern = pattern
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = CreateEventRequestRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = CreateEventRequestRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class CreateEventRequestAttendees(TeaModel):
def __init__(
self,
id: str = None,
):
self.id = id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
return self
class CreateEventRequestLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class CreateEventRequestReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: int = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class CreateEventRequestOnlineMeetingInfo(TeaModel):
def __init__(
self,
type: str = None,
):
self.type = type
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
return self
class CreateEventRequest(TeaModel):
def __init__(
self,
summary: str = None,
description: str = None,
start: CreateEventRequestStart = None,
end: CreateEventRequestEnd = None,
is_all_day: bool = None,
recurrence: CreateEventRequestRecurrence = None,
attendees: List[CreateEventRequestAttendees] = None,
location: CreateEventRequestLocation = None,
reminders: List[CreateEventRequestReminders] = None,
online_meeting_info: CreateEventRequestOnlineMeetingInfo = None,
extra: Dict[str, str] = None,
):
        # Event title
self.summary = summary
        # Event description
self.description = description
        # Event start time
self.start = start
        # Event end time
self.end = end
        # Whether this is an all-day event
self.is_all_day = is_all_day
        # Event recurrence rule
self.recurrence = recurrence
self.attendees = attendees
self.location = location
self.reminders = reminders
self.online_meeting_info = online_meeting_info
        # Extra information
self.extra = extra
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.location:
self.location.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
if self.online_meeting_info:
self.online_meeting_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.location is not None:
result['location'] = self.location.to_map()
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
if self.online_meeting_info is not None:
result['onlineMeetingInfo'] = self.online_meeting_info.to_map()
if self.extra is not None:
result['extra'] = self.extra
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = CreateEventRequestStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = CreateEventRequestEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = CreateEventRequestRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = CreateEventRequestAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('location') is not None:
temp_model = CreateEventRequestLocation()
self.location = temp_model.from_map(m['location'])
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = CreateEventRequestReminders()
self.reminders.append(temp_model.from_map(k))
if m.get('onlineMeetingInfo') is not None:
temp_model = CreateEventRequestOnlineMeetingInfo()
self.online_meeting_info = temp_model.from_map(m['onlineMeetingInfo'])
if m.get('extra') is not None:
self.extra = m.get('extra')
return self
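# All of the models above follow the same serialization convention: snake_case
# attributes map to camelCase keys, None-valued fields are omitted from the map,
# and from_map reads the camelCase keys back. A minimal, self-contained sketch
# of that convention (plain functions and illustrative field values; this does
# not depend on the TeaModel base class):

```python
def to_map(summary=None, is_all_day=None, extra=None):
    """Mirror of the to_map convention: omit None fields, emit camelCase keys."""
    result = dict()
    if summary is not None:
        result['summary'] = summary
    if is_all_day is not None:
        result['isAllDay'] = is_all_day
    if extra is not None:
        result['extra'] = extra
    return result


def from_map(m=None):
    """Mirror of from_map: read camelCase keys back into snake_case names."""
    m = m or dict()
    return {
        'summary': m.get('summary'),
        'is_all_day': m.get('isAllDay'),
        'extra': m.get('extra'),
    }


# Round trip: unset fields never appear in the serialized map.
payload = to_map(summary='Weekly sync', is_all_day=False)
restored = from_map(payload)
```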
class CreateEventResponseBodyStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class CreateEventResponseBodyEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class CreateEventResponseBodyRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class CreateEventResponseBodyRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class CreateEventResponseBodyRecurrence(TeaModel):
def __init__(
self,
pattern: CreateEventResponseBodyRecurrencePattern = None,
range: CreateEventResponseBodyRecurrenceRange = None,
):
self.pattern = pattern
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = CreateEventResponseBodyRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = CreateEventResponseBodyRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class CreateEventResponseBodyAttendees(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
self.display_name = display_name
        # Response status
self.response_status = response_status
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class CreateEventResponseBodyOrganizer(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
        # User name
self.display_name = display_name
        # Response status
self.response_status = response_status
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class CreateEventResponseBodyLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class CreateEventResponseBodyReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: str = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class CreateEventResponseBodyOnlineMeetingInfo(TeaModel):
def __init__(
self,
type: str = None,
conference_id: str = None,
url: str = None,
extra_info: Dict[str, Any] = None,
):
self.type = type
self.conference_id = conference_id
self.url = url
self.extra_info = extra_info
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.conference_id is not None:
result['conferenceId'] = self.conference_id
if self.url is not None:
result['url'] = self.url
if self.extra_info is not None:
result['extraInfo'] = self.extra_info
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('conferenceId') is not None:
self.conference_id = m.get('conferenceId')
if m.get('url') is not None:
self.url = m.get('url')
if m.get('extraInfo') is not None:
self.extra_info = m.get('extraInfo')
return self
class CreateEventResponseBody(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
start: CreateEventResponseBodyStart = None,
end: CreateEventResponseBodyEnd = None,
is_all_day: bool = None,
recurrence: CreateEventResponseBodyRecurrence = None,
attendees: List[CreateEventResponseBodyAttendees] = None,
organizer: CreateEventResponseBodyOrganizer = None,
location: CreateEventResponseBodyLocation = None,
reminders: List[CreateEventResponseBodyReminders] = None,
create_time: str = None,
update_time: str = None,
online_meeting_info: CreateEventResponseBodyOnlineMeetingInfo = None,
):
self.id = id
self.summary = summary
self.description = description
        # Event start time
self.start = start
self.end = end
self.is_all_day = is_all_day
self.recurrence = recurrence
self.attendees = attendees
self.organizer = organizer
self.location = location
self.reminders = reminders
        # Creation time
self.create_time = create_time
        # Last-updated time
self.update_time = update_time
self.online_meeting_info = online_meeting_info
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.organizer:
self.organizer.validate()
if self.location:
self.location.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
if self.online_meeting_info:
self.online_meeting_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.organizer is not None:
result['organizer'] = self.organizer.to_map()
if self.location is not None:
result['location'] = self.location.to_map()
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
if self.create_time is not None:
result['createTime'] = self.create_time
if self.update_time is not None:
result['updateTime'] = self.update_time
if self.online_meeting_info is not None:
result['onlineMeetingInfo'] = self.online_meeting_info.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = CreateEventResponseBodyStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = CreateEventResponseBodyEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = CreateEventResponseBodyRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = CreateEventResponseBodyAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('organizer') is not None:
temp_model = CreateEventResponseBodyOrganizer()
self.organizer = temp_model.from_map(m['organizer'])
if m.get('location') is not None:
temp_model = CreateEventResponseBodyLocation()
self.location = temp_model.from_map(m['location'])
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = CreateEventResponseBodyReminders()
self.reminders.append(temp_model.from_map(k))
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('updateTime') is not None:
self.update_time = m.get('updateTime')
if m.get('onlineMeetingInfo') is not None:
temp_model = CreateEventResponseBodyOnlineMeetingInfo()
self.online_meeting_info = temp_model.from_map(m['onlineMeetingInfo'])
return self
class CreateEventResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: CreateEventResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = CreateEventResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ListCalendarsHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ListCalendarsResponseBodyResponseCalendars(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
time_zone: str = None,
e_tag: str = None,
type: str = None,
privilege: str = None,
):
        # Calendar ID
self.id = id
        # Calendar title
self.summary = summary
        # Calendar description
self.description = description
        # Time zone
self.time_zone = time_zone
        # ETag of the Calendar resource, used to detect whether the Calendar or any Event inside it has been updated
self.e_tag = e_tag
        # Calendar type
self.type = type
        # Privilege information
self.privilege = privilege
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.time_zone is not None:
result['timeZone'] = self.time_zone
if self.e_tag is not None:
result['eTag'] = self.e_tag
if self.type is not None:
result['type'] = self.type
if self.privilege is not None:
result['privilege'] = self.privilege
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
if m.get('eTag') is not None:
self.e_tag = m.get('eTag')
if m.get('type') is not None:
self.type = m.get('type')
if m.get('privilege') is not None:
self.privilege = m.get('privilege')
return self
class ListCalendarsResponseBodyResponse(TeaModel):
def __init__(
self,
calendars: List[ListCalendarsResponseBodyResponseCalendars] = None,
):
self.calendars = calendars
def validate(self):
if self.calendars:
for k in self.calendars:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
result['calendars'] = []
if self.calendars is not None:
for k in self.calendars:
result['calendars'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
self.calendars = []
if m.get('calendars') is not None:
for k in m.get('calendars'):
temp_model = ListCalendarsResponseBodyResponseCalendars()
self.calendars.append(temp_model.from_map(k))
return self
class ListCalendarsResponseBody(TeaModel):
def __init__(
self,
response: ListCalendarsResponseBodyResponse = None,
):
        # Calendar information
self.response = response
def validate(self):
if self.response:
self.response.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.response is not None:
result['response'] = self.response.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('response') is not None:
temp_model = ListCalendarsResponseBodyResponse()
self.response = temp_model.from_map(m['response'])
return self
class ListCalendarsResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ListCalendarsResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ListCalendarsResponseBody()
self.body = temp_model.from_map(m['body'])
return self
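# Illustrative usage sketch (not part of the generated SDK; the field values
# below are made-up examples, the keys are the wire names mapped above):
# deserializing a ListCalendars payload and round-tripping it via the models.
#
#   body = ListCalendarsResponseBody().from_map(
#       {'response': {'calendars': [{'id': 'cal-1', 'summary': 'Team calendar'}]}}
#   )
#   assert body.response.calendars[0].summary == 'Team calendar'
#   assert body.to_map()['response']['calendars'][0]['id'] == 'cal-1'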
class GetSignInListHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetSignInListRequest(TeaModel):
def __init__(
self,
max_results: int = None,
next_token: str = None,
type: str = None,
):
        # Maximum number of results to return (up to 200)
self.max_results = max_results
self.next_token = next_token
        # Sign-in record type (check_in, not_yet_check_in)
self.type = type
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.max_results is not None:
result['maxResults'] = self.max_results
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.type is not None:
result['type'] = self.type
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('type') is not None:
self.type = m.get('type')
return self
class GetSignInListResponseBodyUsers(TeaModel):
def __init__(
self,
user_id: str = None,
display_name: str = None,
check_in_time: int = None,
):
self.user_id = user_id
        # User name
self.display_name = display_name
        # Sign-in time
self.check_in_time = check_in_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.user_id is not None:
result['userId'] = self.user_id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.check_in_time is not None:
result['checkInTime'] = self.check_in_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('userId') is not None:
self.user_id = m.get('userId')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('checkInTime') is not None:
self.check_in_time = m.get('checkInTime')
return self
class GetSignInListResponseBody(TeaModel):
def __init__(
self,
next_token: str = None,
users: List[GetSignInListResponseBodyUsers] = None,
):
        # Pagination token
self.next_token = next_token
        # Sign-in records
self.users = users
def validate(self):
if self.users:
for k in self.users:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
result['users'] = []
if self.users is not None:
for k in self.users:
result['users'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
self.users = []
if m.get('users') is not None:
for k in m.get('users'):
temp_model = GetSignInListResponseBodyUsers()
self.users.append(temp_model.from_map(k))
return self
class GetSignInListResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetSignInListResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetSignInListResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class DeleteAclHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class DeleteAclResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
):
self.headers = headers
def validate(self):
self.validate_required(self.headers, 'headers')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
return self
class DeleteEventHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class DeleteEventResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
):
self.headers = headers
def validate(self):
self.validate_required(self.headers, 'headers')
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
return self
class ListEventsHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ListEventsRequest(TeaModel):
def __init__(
self,
time_min: str = None,
time_max: str = None,
show_deleted: bool = None,
max_results: int = None,
next_token: str = None,
sync_token: str = None,
):
        # Query start time
self.time_min = time_min
        # Query end time
self.time_max = time_max
        # Whether to return deleted events
self.show_deleted = show_deleted
        # Maximum number of results to return
self.max_results = max_results
        # Pagination token
self.next_token = next_token
        # Incremental sync token
self.sync_token = sync_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.time_min is not None:
result['timeMin'] = self.time_min
if self.time_max is not None:
result['timeMax'] = self.time_max
if self.show_deleted is not None:
result['showDeleted'] = self.show_deleted
if self.max_results is not None:
result['maxResults'] = self.max_results
if self.next_token is not None:
result['nextToken'] = self.next_token
if self.sync_token is not None:
result['syncToken'] = self.sync_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('timeMin') is not None:
self.time_min = m.get('timeMin')
if m.get('timeMax') is not None:
self.time_max = m.get('timeMax')
if m.get('showDeleted') is not None:
self.show_deleted = m.get('showDeleted')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
if m.get('syncToken') is not None:
self.sync_token = m.get('syncToken')
return self
class ListEventsResponseBodyEventsStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
        # Date, in yyyyMMdd format
self.date = date
        # Timestamp, in ISO 8601 format
self.date_time = date_time
        # Time zone
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class ListEventsResponseBodyEventsEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class ListEventsResponseBodyEventsRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
        # Recurrence pattern type (daily, weekly, absoluteMonthly, relativeMonthly, absoluteYearly, relativeYearly)
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class ListEventsResponseBodyEventsRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
        # Range type (endDate, noEnd, numbered)
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class ListEventsResponseBodyEventsRecurrence(TeaModel):
def __init__(
self,
pattern: ListEventsResponseBodyEventsRecurrencePattern = None,
range: ListEventsResponseBodyEventsRecurrenceRange = None,
):
        # Recurrence pattern
self.pattern = pattern
        # Recurrence range
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = ListEventsResponseBodyEventsRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = ListEventsResponseBodyEventsRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class ListEventsResponseBodyEventsAttendees(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
        # User ID
self.id = id
        # User name
self.display_name = display_name
        # Response status
self.response_status = response_status
        # Whether this is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class ListEventsResponseBodyEventsOrganizer(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
        # User ID
self.id = id
        # User name
self.display_name = display_name
        # Response status
self.response_status = response_status
        # Whether this is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class ListEventsResponseBodyEventsLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
        # Display name
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class ListEventsResponseBodyEventsOnlineMeetingInfo(TeaModel):
def __init__(
self,
type: str = None,
conference_id: str = None,
url: str = None,
extra_info: Dict[str, Any] = None,
):
self.type = type
self.conference_id = conference_id
self.url = url
self.extra_info = extra_info
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.conference_id is not None:
result['conferenceId'] = self.conference_id
if self.url is not None:
result['url'] = self.url
if self.extra_info is not None:
result['extraInfo'] = self.extra_info
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('conferenceId') is not None:
self.conference_id = m.get('conferenceId')
if m.get('url') is not None:
self.url = m.get('url')
if m.get('extraInfo') is not None:
self.extra_info = m.get('extraInfo')
return self
class ListEventsResponseBodyEventsReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: str = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class ListEventsResponseBodyEvents(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
start: ListEventsResponseBodyEventsStart = None,
end: ListEventsResponseBodyEventsEnd = None,
is_all_day: bool = None,
recurrence: ListEventsResponseBodyEventsRecurrence = None,
attendees: List[ListEventsResponseBodyEventsAttendees] = None,
organizer: ListEventsResponseBodyEventsOrganizer = None,
location: ListEventsResponseBodyEventsLocation = None,
series_master_id: str = None,
create_time: str = None,
update_time: str = None,
status: str = None,
online_meeting_info: ListEventsResponseBodyEventsOnlineMeetingInfo = None,
reminders: List[ListEventsResponseBodyEventsReminders] = None,
):
        # Event ID
self.id = id
        # Event title
self.summary = summary
        # Event description
self.description = description
        # Event start time
self.start = start
        # Event end time
self.end = end
        # Whether this is an all-day event
self.is_all_day = is_all_day
        # Event recurrence rule
self.recurrence = recurrence
        # Event attendees
self.attendees = attendees
        # Event organizer
self.organizer = organizer
        # Event location
self.location = location
        # Series master event ID for recurring events; empty for non-recurring events
self.series_master_id = series_master_id
        # Creation time
self.create_time = create_time
        # Update time
self.update_time = update_time
        # Event status
self.status = status
self.online_meeting_info = online_meeting_info
self.reminders = reminders
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.organizer:
self.organizer.validate()
if self.location:
self.location.validate()
if self.online_meeting_info:
self.online_meeting_info.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.organizer is not None:
result['organizer'] = self.organizer.to_map()
if self.location is not None:
result['location'] = self.location.to_map()
if self.series_master_id is not None:
result['seriesMasterId'] = self.series_master_id
if self.create_time is not None:
result['createTime'] = self.create_time
if self.update_time is not None:
result['updateTime'] = self.update_time
if self.status is not None:
result['status'] = self.status
if self.online_meeting_info is not None:
result['onlineMeetingInfo'] = self.online_meeting_info.to_map()
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = ListEventsResponseBodyEventsStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = ListEventsResponseBodyEventsEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = ListEventsResponseBodyEventsRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = ListEventsResponseBodyEventsAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('organizer') is not None:
temp_model = ListEventsResponseBodyEventsOrganizer()
self.organizer = temp_model.from_map(m['organizer'])
if m.get('location') is not None:
temp_model = ListEventsResponseBodyEventsLocation()
self.location = temp_model.from_map(m['location'])
if m.get('seriesMasterId') is not None:
self.series_master_id = m.get('seriesMasterId')
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('updateTime') is not None:
self.update_time = m.get('updateTime')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('onlineMeetingInfo') is not None:
temp_model = ListEventsResponseBodyEventsOnlineMeetingInfo()
self.online_meeting_info = temp_model.from_map(m['onlineMeetingInfo'])
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = ListEventsResponseBodyEventsReminders()
self.reminders.append(temp_model.from_map(k))
return self
class ListEventsResponseBody(TeaModel):
def __init__(
self,
next_token: str = None,
events: List[ListEventsResponseBodyEvents] = None,
sync_token: str = None,
):
        # Pagination token
self.next_token = next_token
        # Events
self.events = events
        # Incremental sync token
self.sync_token = sync_token
def validate(self):
if self.events:
for k in self.events:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
result['events'] = []
if self.events is not None:
for k in self.events:
result['events'].append(k.to_map() if k else None)
if self.sync_token is not None:
result['syncToken'] = self.sync_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
self.events = []
if m.get('events') is not None:
for k in m.get('events'):
temp_model = ListEventsResponseBodyEvents()
self.events.append(temp_model.from_map(k))
if m.get('syncToken') is not None:
self.sync_token = m.get('syncToken')
return self
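# Illustrative usage sketch (not part of the generated SDK; the keys are the
# wire names mapped above, the values are made up): ListEventsResponseBody
# carries both a pagination token (nextToken) and an incremental sync token
# (syncToken). Callers typically feed nextToken back into ListEventsRequest
# until it is absent, then persist syncToken for the next incremental query.
#
#   body = ListEventsResponseBody().from_map(
#       {'nextToken': 'page-2', 'syncToken': 'sync-1',
#        'events': [{'id': 'evt-1', 'isAllDay': False}]}
#   )
#   assert body.events[0].id == 'evt-1'
#   assert body.next_token == 'page-2' and body.sync_token == 'sync-1'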
class ListEventsResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ListEventsResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ListEventsResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class ListEventsViewHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class ListEventsViewRequest(TeaModel):
def __init__(
self,
time_min: str = None,
time_max: str = None,
max_results: int = None,
next_token: str = None,
):
        # Query start time
self.time_min = time_min
        # Query end time
self.time_max = time_max
        # Maximum number of results to return
self.max_results = max_results
        # Pagination token
self.next_token = next_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.time_min is not None:
result['timeMin'] = self.time_min
if self.time_max is not None:
result['timeMax'] = self.time_max
if self.max_results is not None:
result['maxResults'] = self.max_results
if self.next_token is not None:
result['nextToken'] = self.next_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('timeMin') is not None:
self.time_min = m.get('timeMin')
if m.get('timeMax') is not None:
self.time_max = m.get('timeMax')
if m.get('maxResults') is not None:
self.max_results = m.get('maxResults')
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
return self
class ListEventsViewResponseBodyEventsStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
        # Date, in yyyyMMdd format
self.date = date
        # Timestamp, in ISO 8601 format
self.date_time = date_time
        # Time zone
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class ListEventsViewResponseBodyEventsEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class ListEventsViewResponseBodyEventsRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
        # Recurrence pattern type (daily, weekly, absoluteMonthly, relativeMonthly, absoluteYearly, relativeYearly)
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
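The recurrence pattern fields above combine into rules such as "every 2 weeks on Monday". The following is a minimal, hedged sketch of how a deserialized pattern map might be rendered for display; `describe_pattern` is an illustrative helper, not part of the generated SDK, and the exact `daysOfWeek` encoding is an assumption to be checked against the API documentation.

```python
# Illustrative only: turn a recurrence-pattern map into a readable phrase.
# The daysOfWeek string format is an assumption, not confirmed by this SDK.
def describe_pattern(p: dict) -> str:
    interval = p.get('interval', 1)
    t = p.get('type')
    if t == 'daily':
        return f"every {interval} day(s)"
    if t == 'weekly':
        return f"every {interval} week(s) on {p.get('daysOfWeek', '?')}"
    if t == 'absoluteMonthly':
        return f"every {interval} month(s) on day {p.get('dayOfMonth', '?')}"
    return f"unsupported type: {t}"

assert describe_pattern({'type': 'weekly', 'interval': 2, 'daysOfWeek': 'monday'}) == 'every 2 week(s) on monday'
```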
class ListEventsViewResponseBodyEventsRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
# Range type (endDate, noEnd, numbered)
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class ListEventsViewResponseBodyEventsRecurrence(TeaModel):
def __init__(
self,
pattern: ListEventsViewResponseBodyEventsRecurrencePattern = None,
range: ListEventsViewResponseBodyEventsRecurrenceRange = None,
):
# Recurrence pattern
self.pattern = pattern
# Recurrence range
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = ListEventsViewResponseBodyEventsRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = ListEventsViewResponseBodyEventsRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class ListEventsViewResponseBodyEventsAttendees(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
# User ID
self.id = id
# User name
self.display_name = display_name
# Response status
self.response_status = response_status
# Whether this attendee is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class ListEventsViewResponseBodyEventsOrganizer(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
# User ID
self.id = id
# User name
self.display_name = display_name
# Response status
self.response_status = response_status
# Whether this organizer is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class ListEventsViewResponseBodyEventsLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
# Display name
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class ListEventsViewResponseBodyEventsOnlineMeetingInfo(TeaModel):
def __init__(
self,
type: str = None,
conference_id: str = None,
url: str = None,
extra_info: Dict[str, Any] = None,
):
self.type = type
self.conference_id = conference_id
self.url = url
self.extra_info = extra_info
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.conference_id is not None:
result['conferenceId'] = self.conference_id
if self.url is not None:
result['url'] = self.url
if self.extra_info is not None:
result['extraInfo'] = self.extra_info
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('conferenceId') is not None:
self.conference_id = m.get('conferenceId')
if m.get('url') is not None:
self.url = m.get('url')
if m.get('extraInfo') is not None:
self.extra_info = m.get('extraInfo')
return self
class ListEventsViewResponseBodyEvents(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
start: ListEventsViewResponseBodyEventsStart = None,
end: ListEventsViewResponseBodyEventsEnd = None,
is_all_day: bool = None,
recurrence: ListEventsViewResponseBodyEventsRecurrence = None,
attendees: List[ListEventsViewResponseBodyEventsAttendees] = None,
organizer: ListEventsViewResponseBodyEventsOrganizer = None,
location: ListEventsViewResponseBodyEventsLocation = None,
series_master_id: str = None,
create_time: str = None,
update_time: str = None,
status: str = None,
online_meeting_info: ListEventsViewResponseBodyEventsOnlineMeetingInfo = None,
):
# Event ID
self.id = id
# Event title
self.summary = summary
# Event description
self.description = description
# Event start time
self.start = start
# Event end time
self.end = end
# Whether this is an all-day event
self.is_all_day = is_all_day
# Event recurrence rule
self.recurrence = recurrence
# Event attendees
self.attendees = attendees
# Event organizer
self.organizer = organizer
# Event location
self.location = location
# Series master event ID for a recurring event; empty for non-recurring events
self.series_master_id = series_master_id
# Creation time
self.create_time = create_time
# Last update time
self.update_time = update_time
# Event status
self.status = status
self.online_meeting_info = online_meeting_info
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.organizer:
self.organizer.validate()
if self.location:
self.location.validate()
if self.online_meeting_info:
self.online_meeting_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.organizer is not None:
result['organizer'] = self.organizer.to_map()
if self.location is not None:
result['location'] = self.location.to_map()
if self.series_master_id is not None:
result['seriesMasterId'] = self.series_master_id
if self.create_time is not None:
result['createTime'] = self.create_time
if self.update_time is not None:
result['updateTime'] = self.update_time
if self.status is not None:
result['status'] = self.status
if self.online_meeting_info is not None:
result['onlineMeetingInfo'] = self.online_meeting_info.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = ListEventsViewResponseBodyEventsStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = ListEventsViewResponseBodyEventsEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = ListEventsViewResponseBodyEventsRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = ListEventsViewResponseBodyEventsAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('organizer') is not None:
temp_model = ListEventsViewResponseBodyEventsOrganizer()
self.organizer = temp_model.from_map(m['organizer'])
if m.get('location') is not None:
temp_model = ListEventsViewResponseBodyEventsLocation()
self.location = temp_model.from_map(m['location'])
if m.get('seriesMasterId') is not None:
self.series_master_id = m.get('seriesMasterId')
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('updateTime') is not None:
self.update_time = m.get('updateTime')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('onlineMeetingInfo') is not None:
temp_model = ListEventsViewResponseBodyEventsOnlineMeetingInfo()
self.online_meeting_info = temp_model.from_map(m['onlineMeetingInfo'])
return self
class ListEventsViewResponseBody(TeaModel):
def __init__(
self,
next_token: str = None,
events: List[ListEventsViewResponseBodyEvents] = None,
):
# Pagination token
self.next_token = next_token
# Event list
self.events = events
def validate(self):
if self.events:
for k in self.events:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.next_token is not None:
result['nextToken'] = self.next_token
result['events'] = []
if self.events is not None:
for k in self.events:
result['events'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('nextToken') is not None:
self.next_token = m.get('nextToken')
self.events = []
if m.get('events') is not None:
for k in m.get('events'):
temp_model = ListEventsViewResponseBodyEvents()
self.events.append(temp_model.from_map(k))
return self
class ListEventsViewResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: ListEventsViewResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = ListEventsViewResponseBody()
self.body = temp_model.from_map(m['body'])
return self
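All of these generated models follow the same `to_map()`/`from_map()` shape: serialize populated attributes to camelCase wire keys, skip `None` fields, and mirror the mapping on deserialization. A standalone sketch of that round-trip, using a simplified stand-in class (`SimpleStart`) rather than the real `TeaModel` base from the Tea SDK:

```python
# SimpleStart is a simplified stand-in for models such as
# ListEventsViewResponseBodyEventsStart; it does not inherit from TeaModel.
class SimpleStart:
    def __init__(self, date=None, date_time=None, time_zone=None):
        self.date = date            # yyyyMMdd date string
        self.date_time = date_time  # ISO 8601 timestamp string
        self.time_zone = time_zone

    def to_map(self):
        # Serialize to camelCase wire keys, skipping unset fields.
        result = {}
        if self.date is not None:
            result['date'] = self.date
        if self.date_time is not None:
            result['dateTime'] = self.date_time
        if self.time_zone is not None:
            result['timeZone'] = self.time_zone
        return result

    def from_map(self, m=None):
        # Populate attributes from a wire map, mirroring to_map().
        m = m or {}
        if m.get('date') is not None:
            self.date = m.get('date')
        if m.get('dateTime') is not None:
            self.date_time = m.get('dateTime')
        if m.get('timeZone') is not None:
            self.time_zone = m.get('timeZone')
        return self

wire = {'dateTime': '2022-01-01T09:00:00+08:00', 'timeZone': 'Asia/Shanghai'}
start = SimpleStart().from_map(wire)
assert start.to_map() == wire  # round-trip preserves the populated keys
```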
class SignInHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class SignInResponseBody(TeaModel):
def __init__(
self,
check_in_time: int = None,
):
# Check-in timestamp
self.check_in_time = check_in_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.check_in_time is not None:
result['checkInTime'] = self.check_in_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('checkInTime') is not None:
self.check_in_time = m.get('checkInTime')
return self
class SignInResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: SignInResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = SignInResponseBody()
self.body = temp_model.from_map(m['body'])
return self
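`SignInResponseBody.check_in_time` is a raw integer timestamp. A hypothetical conversion helper, under the assumption that the value is epoch milliseconds (verify the unit against the DingTalk API documentation before relying on this):

```python
from datetime import datetime, timezone

# Assumption: check_in_time is epoch milliseconds; divide by 1000 for seconds.
def check_in_datetime(check_in_time_ms: int) -> datetime:
    return datetime.fromtimestamp(check_in_time_ms / 1000, tz=timezone.utc)

dt = check_in_datetime(1640995200000)
assert dt.isoformat() == '2022-01-01T00:00:00+00:00'
```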
class GetEventHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class GetEventRequest(TeaModel):
def __init__(
self,
max_attendees: int = None,
):
# Number of attendees to return; at most 500, default 0
self.max_attendees = max_attendees
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.max_attendees is not None:
result['maxAttendees'] = self.max_attendees
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('maxAttendees') is not None:
self.max_attendees = m.get('maxAttendees')
return self
class GetEventResponseBodyStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
# Date, in yyyyMMdd format
self.date = date
# Timestamp, in ISO 8601 format
self.date_time = date_time
# Time zone
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class GetEventResponseBodyEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class GetEventResponseBodyRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
# Recurrence pattern type (type: daily, weekly, absoluteMonthly, relativeMonthly, absoluteYearly, relativeYearly)
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class GetEventResponseBodyRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
# Range type (endDate, noEnd, numbered)
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class GetEventResponseBodyRecurrence(TeaModel):
def __init__(
self,
pattern: GetEventResponseBodyRecurrencePattern = None,
range: GetEventResponseBodyRecurrenceRange = None,
):
# Recurrence pattern
self.pattern = pattern
# Recurrence range
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = GetEventResponseBodyRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = GetEventResponseBodyRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class GetEventResponseBodyAttendees(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
# User name
self.display_name = display_name
# Response status
self.response_status = response_status
# Whether this attendee is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class GetEventResponseBodyOrganizer(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
# User name
self.display_name = display_name
# Response status
self.response_status = response_status
# Whether this organizer is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class GetEventResponseBodyLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class GetEventResponseBodyReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: str = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class GetEventResponseBodyOnlineMeetingInfo(TeaModel):
def __init__(
self,
type: str = None,
conference_id: str = None,
url: str = None,
extra_info: Dict[str, Any] = None,
):
self.type = type
self.conference_id = conference_id
self.url = url
self.extra_info = extra_info
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.conference_id is not None:
result['conferenceId'] = self.conference_id
if self.url is not None:
result['url'] = self.url
if self.extra_info is not None:
result['extraInfo'] = self.extra_info
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('conferenceId') is not None:
self.conference_id = m.get('conferenceId')
if m.get('url') is not None:
self.url = m.get('url')
if m.get('extraInfo') is not None:
self.extra_info = m.get('extraInfo')
return self
class GetEventResponseBody(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
status: str = None,
start: GetEventResponseBodyStart = None,
end: GetEventResponseBodyEnd = None,
is_all_day: bool = None,
recurrence: GetEventResponseBodyRecurrence = None,
attendees: List[GetEventResponseBodyAttendees] = None,
organizer: GetEventResponseBodyOrganizer = None,
location: GetEventResponseBodyLocation = None,
series_master_id: str = None,
create_time: str = None,
update_time: str = None,
reminders: List[GetEventResponseBodyReminders] = None,
online_meeting_info: GetEventResponseBodyOnlineMeetingInfo = None,
):
self.id = id
# Event title
self.summary = summary
# Event description
self.description = description
# Event status
self.status = status
# Event start time
self.start = start
# Event end time
self.end = end
# Whether this is an all-day event
self.is_all_day = is_all_day
self.recurrence = recurrence
self.attendees = attendees
self.organizer = organizer
self.location = location
# Series master event ID for a recurring event; empty for non-recurring events
self.series_master_id = series_master_id
# Creation time
self.create_time = create_time
# Last update time
self.update_time = update_time
self.reminders = reminders
self.online_meeting_info = online_meeting_info
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.organizer:
self.organizer.validate()
if self.location:
self.location.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
if self.online_meeting_info:
self.online_meeting_info.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.status is not None:
result['status'] = self.status
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.organizer is not None:
result['organizer'] = self.organizer.to_map()
if self.location is not None:
result['location'] = self.location.to_map()
if self.series_master_id is not None:
result['seriesMasterId'] = self.series_master_id
if self.create_time is not None:
result['createTime'] = self.create_time
if self.update_time is not None:
result['updateTime'] = self.update_time
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
if self.online_meeting_info is not None:
result['onlineMeetingInfo'] = self.online_meeting_info.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('status') is not None:
self.status = m.get('status')
if m.get('start') is not None:
temp_model = GetEventResponseBodyStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = GetEventResponseBodyEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = GetEventResponseBodyRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = GetEventResponseBodyAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('organizer') is not None:
temp_model = GetEventResponseBodyOrganizer()
self.organizer = temp_model.from_map(m['organizer'])
if m.get('location') is not None:
temp_model = GetEventResponseBodyLocation()
self.location = temp_model.from_map(m['location'])
if m.get('seriesMasterId') is not None:
self.series_master_id = m.get('seriesMasterId')
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('updateTime') is not None:
self.update_time = m.get('updateTime')
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = GetEventResponseBodyReminders()
self.reminders.append(temp_model.from_map(k))
if m.get('onlineMeetingInfo') is not None:
temp_model = GetEventResponseBodyOnlineMeetingInfo()
self.online_meeting_info = temp_model.from_map(m['onlineMeetingInfo'])
return self
class GetEventResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: GetEventResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = GetEventResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class CheckInHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class CheckInResponseBody(TeaModel):
def __init__(
self,
check_in_time: int = None,
):
# Check-in timestamp
self.check_in_time = check_in_time
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.check_in_time is not None:
result['checkInTime'] = self.check_in_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('checkInTime') is not None:
self.check_in_time = m.get('checkInTime')
return self
class CheckInResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: CheckInResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = CheckInResponseBody()
self.body = temp_model.from_map(m['body'])
return self
class PatchEventHeaders(TeaModel):
def __init__(
self,
common_headers: Dict[str, str] = None,
x_acs_dingtalk_access_token: str = None,
):
self.common_headers = common_headers
self.x_acs_dingtalk_access_token = x_acs_dingtalk_access_token
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.common_headers is not None:
result['commonHeaders'] = self.common_headers
if self.x_acs_dingtalk_access_token is not None:
result['x-acs-dingtalk-access-token'] = self.x_acs_dingtalk_access_token
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('commonHeaders') is not None:
self.common_headers = m.get('commonHeaders')
if m.get('x-acs-dingtalk-access-token') is not None:
self.x_acs_dingtalk_access_token = m.get('x-acs-dingtalk-access-token')
return self
class PatchEventRequestStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class PatchEventRequestEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class PatchEventRequestRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class PatchEventRequestRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class PatchEventRequestRecurrence(TeaModel):
def __init__(
self,
pattern: PatchEventRequestRecurrencePattern = None,
range: PatchEventRequestRecurrenceRange = None,
):
self.pattern = pattern
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = PatchEventRequestRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = PatchEventRequestRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class PatchEventRequestAttendees(TeaModel):
def __init__(
self,
id: str = None,
):
self.id = id
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
return self
class PatchEventRequestLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class PatchEventRequestReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: int = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class PatchEventRequest(TeaModel):
def __init__(
self,
summary: str = None,
id: str = None,
description: str = None,
start: PatchEventRequestStart = None,
end: PatchEventRequestEnd = None,
is_all_day: bool = None,
recurrence: PatchEventRequestRecurrence = None,
attendees: List[PatchEventRequestAttendees] = None,
location: PatchEventRequestLocation = None,
extra: Dict[str, str] = None,
reminders: List[PatchEventRequestReminders] = None,
):
# Event title
self.summary = summary
# Event ID
self.id = id
self.description = description
# Event start time
self.start = start
self.end = end
self.is_all_day = is_all_day
self.recurrence = recurrence
self.attendees = attendees
self.location = location
# Extended information
self.extra = extra
self.reminders = reminders
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.location:
self.location.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.summary is not None:
result['summary'] = self.summary
if self.id is not None:
result['id'] = self.id
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.location is not None:
result['location'] = self.location.to_map()
if self.extra is not None:
result['extra'] = self.extra
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('id') is not None:
self.id = m.get('id')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = PatchEventRequestStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = PatchEventRequestEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = PatchEventRequestRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = PatchEventRequestAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('location') is not None:
temp_model = PatchEventRequestLocation()
self.location = temp_model.from_map(m['location'])
if m.get('extra') is not None:
self.extra = m.get('extra')
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = PatchEventRequestReminders()
self.reminders.append(temp_model.from_map(k))
return self
class PatchEventResponseBodyStart(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class PatchEventResponseBodyEnd(TeaModel):
def __init__(
self,
date: str = None,
date_time: str = None,
time_zone: str = None,
):
self.date = date
self.date_time = date_time
self.time_zone = time_zone
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.date is not None:
result['date'] = self.date
if self.date_time is not None:
result['dateTime'] = self.date_time
if self.time_zone is not None:
result['timeZone'] = self.time_zone
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('date') is not None:
self.date = m.get('date')
if m.get('dateTime') is not None:
self.date_time = m.get('dateTime')
if m.get('timeZone') is not None:
self.time_zone = m.get('timeZone')
return self
class PatchEventResponseBodyRecurrencePattern(TeaModel):
def __init__(
self,
type: str = None,
day_of_month: int = None,
days_of_week: str = None,
index: str = None,
interval: int = None,
):
self.type = type
self.day_of_month = day_of_month
self.days_of_week = days_of_week
self.index = index
self.interval = interval
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.day_of_month is not None:
result['dayOfMonth'] = self.day_of_month
if self.days_of_week is not None:
result['daysOfWeek'] = self.days_of_week
if self.index is not None:
result['index'] = self.index
if self.interval is not None:
result['interval'] = self.interval
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('dayOfMonth') is not None:
self.day_of_month = m.get('dayOfMonth')
if m.get('daysOfWeek') is not None:
self.days_of_week = m.get('daysOfWeek')
if m.get('index') is not None:
self.index = m.get('index')
if m.get('interval') is not None:
self.interval = m.get('interval')
return self
class PatchEventResponseBodyRecurrenceRange(TeaModel):
def __init__(
self,
type: str = None,
end_date: str = None,
number_of_occurrences: int = None,
):
self.type = type
self.end_date = end_date
self.number_of_occurrences = number_of_occurrences
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.type is not None:
result['type'] = self.type
if self.end_date is not None:
result['endDate'] = self.end_date
if self.number_of_occurrences is not None:
result['numberOfOccurrences'] = self.number_of_occurrences
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('type') is not None:
self.type = m.get('type')
if m.get('endDate') is not None:
self.end_date = m.get('endDate')
if m.get('numberOfOccurrences') is not None:
self.number_of_occurrences = m.get('numberOfOccurrences')
return self
class PatchEventResponseBodyRecurrence(TeaModel):
def __init__(
self,
pattern: PatchEventResponseBodyRecurrencePattern = None,
range: PatchEventResponseBodyRecurrenceRange = None,
):
self.pattern = pattern
self.range = range
def validate(self):
if self.pattern:
self.pattern.validate()
if self.range:
self.range.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.pattern is not None:
result['pattern'] = self.pattern.to_map()
if self.range is not None:
result['range'] = self.range.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('pattern') is not None:
temp_model = PatchEventResponseBodyRecurrencePattern()
self.pattern = temp_model.from_map(m['pattern'])
if m.get('range') is not None:
temp_model = PatchEventResponseBodyRecurrenceRange()
self.range = temp_model.from_map(m['range'])
return self
class PatchEventResponseBodyAttendees(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
# User display name
self.display_name = display_name
# Response (RSVP) status
self.response_status = response_status
# Whether this is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class PatchEventResponseBodyOrganizer(TeaModel):
def __init__(
self,
id: str = None,
display_name: str = None,
response_status: str = None,
self_: bool = None,
):
self.id = id
# User display name
self.display_name = display_name
# Response (RSVP) status
self.response_status = response_status
# Whether this is the currently logged-in user
self.self_ = self_
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.display_name is not None:
result['displayName'] = self.display_name
if self.response_status is not None:
result['responseStatus'] = self.response_status
if self.self_ is not None:
result['self'] = self.self_
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
if m.get('responseStatus') is not None:
self.response_status = m.get('responseStatus')
if m.get('self') is not None:
self.self_ = m.get('self')
return self
class PatchEventResponseBodyLocation(TeaModel):
def __init__(
self,
display_name: str = None,
):
self.display_name = display_name
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.display_name is not None:
result['displayName'] = self.display_name
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('displayName') is not None:
self.display_name = m.get('displayName')
return self
class PatchEventResponseBodyReminders(TeaModel):
def __init__(
self,
method: str = None,
minutes: str = None,
):
self.method = method
self.minutes = minutes
def validate(self):
pass
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.method is not None:
result['method'] = self.method
if self.minutes is not None:
result['minutes'] = self.minutes
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('method') is not None:
self.method = m.get('method')
if m.get('minutes') is not None:
self.minutes = m.get('minutes')
return self
class PatchEventResponseBody(TeaModel):
def __init__(
self,
id: str = None,
summary: str = None,
description: str = None,
start: PatchEventResponseBodyStart = None,
end: PatchEventResponseBodyEnd = None,
is_all_day: bool = None,
recurrence: PatchEventResponseBodyRecurrence = None,
attendees: List[PatchEventResponseBodyAttendees] = None,
organizer: PatchEventResponseBodyOrganizer = None,
location: PatchEventResponseBodyLocation = None,
reminders: List[PatchEventResponseBodyReminders] = None,
create_time: str = None,
update_time: str = None,
):
self.id = id
self.summary = summary
self.description = description
# Event start time
self.start = start
self.end = end
self.is_all_day = is_all_day
self.recurrence = recurrence
self.attendees = attendees
self.organizer = organizer
self.location = location
self.reminders = reminders
# Creation time
self.create_time = create_time
# Last update time
self.update_time = update_time
def validate(self):
if self.start:
self.start.validate()
if self.end:
self.end.validate()
if self.recurrence:
self.recurrence.validate()
if self.attendees:
for k in self.attendees:
if k:
k.validate()
if self.organizer:
self.organizer.validate()
if self.location:
self.location.validate()
if self.reminders:
for k in self.reminders:
if k:
k.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.id is not None:
result['id'] = self.id
if self.summary is not None:
result['summary'] = self.summary
if self.description is not None:
result['description'] = self.description
if self.start is not None:
result['start'] = self.start.to_map()
if self.end is not None:
result['end'] = self.end.to_map()
if self.is_all_day is not None:
result['isAllDay'] = self.is_all_day
if self.recurrence is not None:
result['recurrence'] = self.recurrence.to_map()
result['attendees'] = []
if self.attendees is not None:
for k in self.attendees:
result['attendees'].append(k.to_map() if k else None)
if self.organizer is not None:
result['organizer'] = self.organizer.to_map()
if self.location is not None:
result['location'] = self.location.to_map()
result['reminders'] = []
if self.reminders is not None:
for k in self.reminders:
result['reminders'].append(k.to_map() if k else None)
if self.create_time is not None:
result['createTime'] = self.create_time
if self.update_time is not None:
result['updateTime'] = self.update_time
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('id') is not None:
self.id = m.get('id')
if m.get('summary') is not None:
self.summary = m.get('summary')
if m.get('description') is not None:
self.description = m.get('description')
if m.get('start') is not None:
temp_model = PatchEventResponseBodyStart()
self.start = temp_model.from_map(m['start'])
if m.get('end') is not None:
temp_model = PatchEventResponseBodyEnd()
self.end = temp_model.from_map(m['end'])
if m.get('isAllDay') is not None:
self.is_all_day = m.get('isAllDay')
if m.get('recurrence') is not None:
temp_model = PatchEventResponseBodyRecurrence()
self.recurrence = temp_model.from_map(m['recurrence'])
self.attendees = []
if m.get('attendees') is not None:
for k in m.get('attendees'):
temp_model = PatchEventResponseBodyAttendees()
self.attendees.append(temp_model.from_map(k))
if m.get('organizer') is not None:
temp_model = PatchEventResponseBodyOrganizer()
self.organizer = temp_model.from_map(m['organizer'])
if m.get('location') is not None:
temp_model = PatchEventResponseBodyLocation()
self.location = temp_model.from_map(m['location'])
self.reminders = []
if m.get('reminders') is not None:
for k in m.get('reminders'):
temp_model = PatchEventResponseBodyReminders()
self.reminders.append(temp_model.from_map(k))
if m.get('createTime') is not None:
self.create_time = m.get('createTime')
if m.get('updateTime') is not None:
self.update_time = m.get('updateTime')
return self
class PatchEventResponse(TeaModel):
def __init__(
self,
headers: Dict[str, str] = None,
body: PatchEventResponseBody = None,
):
self.headers = headers
self.body = body
def validate(self):
self.validate_required(self.headers, 'headers')
self.validate_required(self.body, 'body')
if self.body:
self.body.validate()
def to_map(self):
_map = super().to_map()
if _map is not None:
return _map
result = dict()
if self.headers is not None:
result['headers'] = self.headers
if self.body is not None:
result['body'] = self.body.to_map()
return result
def from_map(self, m: dict = None):
m = m or dict()
if m.get('headers') is not None:
self.headers = m.get('headers')
if m.get('body') is not None:
temp_model = PatchEventResponseBody()
self.body = temp_model.from_map(m['body'])
return self
# ==== Tic Tac Toe/tic-tac-toe.py (repo: Tess314/python-projects, license: Apache-2.0) ====
pygame.init()
size = width, height = 550, 540
screen = pygame.display.set_mode(size)
pygame.display.set_caption("Tic Tac Toe")
# drawing on screen
zero = pygame.draw.rect(screen, (255, 255, 255), (20, 20, 150, 150))
first = pygame.draw.rect(screen, (255, 255, 255), (200, 20, 150, 150))
second = pygame.draw.rect(screen, (255, 255, 255), (380, 20, 150, 150))
third = pygame.draw.rect(screen, (255, 255, 255), (20, 190, 150, 150))
fourth = pygame.draw.rect(screen, (255, 255, 255), (200, 190, 150, 150))
fifth = pygame.draw.rect(screen, (255, 255, 255), (380, 190, 150, 150))
sixth = pygame.draw.rect(screen, (255, 255, 255), (20, 360, 150, 150))
seventh = pygame.draw.rect(screen, (255, 255, 255), (200, 360, 150, 150))
eighth = pygame.draw.rect(screen, (255, 255, 255), (380, 360, 150, 150))
running = True
# 0: red, 1: yellow, 2: idle / empty
gameState = [2, 2, 2, 2, 2, 2, 2, 2, 2]
winningPositions = [[0, 1, 2], [3, 4, 5], [6, 7, 8], [0, 3, 6], [1, 4, 7], [2, 5, 8], [0, 4, 8], [2, 4, 6]]
activePlayer = 0
while running:
for event in pygame.event.get():
if event.type == pygame.QUIT:
running = False
if event.type == pygame.MOUSEBUTTONDOWN:
if zero.collidepoint(event.pos):
if gameState[0] == 2:
gameState[0] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (45, 45, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (45, 45, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if first.collidepoint(event.pos):
if gameState[1] == 2:
gameState[1] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (225, 45, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (225, 45, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if second.collidepoint(event.pos):
if gameState[2] == 2:
gameState[2] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (405, 45, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (405, 45, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if third.collidepoint(event.pos):
if gameState[3] == 2:
gameState[3] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (45, 215, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (45, 215, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if fourth.collidepoint(event.pos):
if gameState[4] == 2:
gameState[4] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (225, 215, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (225, 215, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if fifth.collidepoint(event.pos):
if gameState[5] == 2:
gameState[5] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (405, 215, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (405, 215, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if sixth.collidepoint(event.pos):
if gameState[6] == 2:
gameState[6] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (45, 385, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (45, 385, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if seventh.collidepoint(event.pos):
if gameState[7] == 2:
gameState[7] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (225, 385, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (225, 385, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
if eighth.collidepoint(event.pos):
if gameState[8] == 2:
gameState[8] = activePlayer
if activePlayer == 0:
pygame.draw.rect(screen, (255, 0, 0), (405, 385, 100, 100))
activePlayer = 1
else:
pygame.draw.rect(screen, (0, 255, 0), (405, 385, 100, 100))
activePlayer = 0
for winningPos in winningPositions:
if gameState[winningPos[0]] == gameState[winningPos[1]] and gameState[winningPos[1]] == \
gameState[winningPos[2]]:
if gameState[winningPos[0]] != 2:
if activePlayer == 0:
print("Green has won")
running = False
elif activePlayer == 1:
print("Red has won")
running = False
pygame.display.update()
pygame.quit()
# ==== UserAgentRandom/TianYanSpider.py (repo: MrJianLiang/test_obj, license: Apache-2.0) ====
3525befe73618719729248ff71b6633a3b06e13c | 4,217 | py | Python | UserAgentRandom/TianYanSpider.py | MrJianLiang/test_obj | e6898d746e92e576f96f432ee2355f8c912e7d1e | [
"Apache-2.0"
] | null | null | null | UserAgentRandom/TianYanSpider.py | MrJianLiang/test_obj | e6898d746e92e576f96f432ee2355f8c912e7d1e | [
"Apache-2.0"
] | null | null | null | UserAgentRandom/TianYanSpider.py | MrJianLiang/test_obj | e6898d746e92e576f96f432ee2355f8c912e7d1e | [
"Apache-2.0"
] | null | null | null | import requests
import time
import random
class UserAgentOne:
    def random_ua(self):
        # Randomly pick a user-agent string; `random` is imported at module level
agents = [
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.87 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.87 Safari/537.36 OPR/37.0.2178.32',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.57.2 (KHTML, like Gecko) Version/5.1.7 Safari/534.57.2',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.101 Safari/537.36',
'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2486.0 Safari/537.36 Edge/13.10586',
'Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko',
'Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.1; WOW64; Trident/6.0)',
'Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.106 BIDUBrowser/8.3 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Maxthon/4.9.2.1000 Chrome/39.0.2146.0 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.80 Safari/537.36 Core/1.47.277.400 QQBrowser/9.4.7658.400',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.116 UBrowser/5.6.12150.8 Safari/537.36',
'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36 SE 2.X MetaSr 1.0',
]
        agent = random.choice(agents)
        return agent
if __name__ == '__main__':
ua = UserAgentOne()
ua_rand = ua.random_ua()
print(ua_rand) | 70.283333 | 164 | 0.640977 | 738 | 4,217 | 3.646341 | 0.131436 | 0.074322 | 0.100334 | 0.154589 | 0.901524 | 0.901524 | 0.901524 | 0.901524 | 0.901524 | 0.901524 | 0 | 0.226282 | 0.195162 | 4,217 | 60 | 165 | 70.283333 | 0.566588 | 0.469528 | 0 | 0.137931 | 0 | 0.517241 | 0.738009 | 0.009955 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034483 | false | 0 | 0.137931 | 0 | 0.241379 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
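The class above only returns a user-agent string; in practice it would be placed in a request's headers. A minimal usage sketch (the `agents` list is a shortened stand-in for the one above, and the `requests.get` call is left commented out so the sketch needs no network access):

```python
import random

# A few of the user-agent strings from the class above
agents = [
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.87 Safari/537.36',
    'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0',
    'Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko',
]

def random_headers():
    """Build a headers dict with a randomly chosen User-Agent."""
    return {"User-Agent": random.choice(agents)}

headers = random_headers()
# A real request would then look like:
# resp = requests.get("https://example.com", headers=headers)
```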
3541f99c908b6af282f185155caa84cd714057e1 | 2,618 | py | Python | problems/p008.py | VincentHaring/python_project-euler | b6845047097a61b0487ad15a0c40c703a307c7b5 | [
"MIT"
] | null | null | null | problems/p008.py | VincentHaring/python_project-euler | b6845047097a61b0487ad15a0c40c703a307c7b5 | [
"MIT"
] | null | null | null | problems/p008.py | VincentHaring/python_project-euler | b6845047097a61b0487ad15a0c40c703a307c7b5 | [
"MIT"
] | null | null | null | """
Problem 8
The four adjacent digits in the 1000-digit number that have the greatest product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the greatest product. What is the value of this product?
"""
from helper import *
def p8():
n = (
'73167176531330624919225119674426574742355349194934'
'96983520312774506326239578318016984801869478851843'
'85861560789112949495459501737958331952853208805511'
'12540698747158523863050715693290963295227443043557'
'66896648950445244523161731856403098711121722383113'
'62229893423380308135336276614282806444486645238749'
'30358907296290491560440772390713810515859307960866'
'70172427121883998797908792274921901699720888093776'
'65727333001053367881220235421809751254540594752243'
'52584907711670556013604839586446706324415722155397'
'53697817977846174064955149290862569321978468622482'
'83972241375657056057490261407972968652414535100474'
'82166370484403199890008895243450658541227588666881'
'16427171479924442928230863465674813919123162824586'
'17866458359124566529476545682848912883142607690042'
'24219022671055626321111109370544217506941658960408'
'07198403850962455444362981230987879927244284909188'
'84580156166097919133875499200524063689912560717606'
'05886116467109405077541002256983155200055935729725'
'71636269561882670428252483600823257530420752963450'
)
    window = 13
    products = []
    for i in range(len(n) - window + 1):
        product = 1
        for j in range(i, i + window):
            product *= int(n[j])
        products.append(product)
    return max(products)
if __name__ == "__main__":
print("Problem 8: %d" % p8()) | 34.447368 | 125 | 0.880443 | 131 | 2,618 | 17.557252 | 0.534351 | 0.006957 | 0.013913 | 0.016522 | 0.921739 | 0.921739 | 0.921739 | 0.921739 | 0.921739 | 0.921739 | 0 | 0.839486 | 0.079068 | 2,618 | 76 | 126 | 34.447368 | 0.113231 | 0.483193 | 0 | 0 | 0 | 0 | 0.758544 | 0.742942 | 0 | 1 | 0 | 0 | 0 | 1 | 0.029412 | false | 0 | 0.029412 | 0 | 0.088235 | 0.029412 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 12 |
102b708fa1ed4cb2a73e19b5b09ef9ce2b4e9ae2 | 8,498 | py | Python | tests/test_memocell_utils.py | hoefer-lab/memocell | 5dc08d121e64fbde1ccdce86f0f1390e6918d255 | [
"MIT"
] | null | null | null | tests/test_memocell_utils.py | hoefer-lab/memocell | 5dc08d121e64fbde1ccdce86f0f1390e6918d255 | [
"MIT"
] | null | null | null | tests/test_memocell_utils.py | hoefer-lab/memocell | 5dc08d121e64fbde1ccdce86f0f1390e6918d255 | [
"MIT"
] | 1 | 2021-05-25T12:54:51.000Z | 2021-05-25T12:54:51.000Z |
# for package testing with pytest call
# in upper directory "$ python setup.py pytest"
# or in this directory "$ py.test test_memocell_[...].py"
# or after pip installation $py.test --pyargs memocell$
import pytest
import memocell as me
import numpy as np
class TestUtilsModule(object):
def test_utils_phase_type_from_erlang_1(self):
a, S = me.utils.phase_type_from_erlang(0.2, 1)
np.testing.assert_allclose(a, np.array([[1.]]))
np.testing.assert_allclose(S, np.array([[-0.2]]))
def test_utils_phase_type_from_erlang_1_pdf(self):
a, S = me.utils.phase_type_from_erlang(0.2, 1)
x = np.linspace(0.0, 20.0, num=11, endpoint=True)
y = me.utils.phase_type_pdf(a, S, x)
np.testing.assert_allclose(y, np.array([0.2 , 0.13406401, 0.08986579, 0.06023884, 0.0403793 ,
0.02706706, 0.01814359, 0.01216201, 0.00815244, 0.00546474,
0.00366313]),
rtol=1e-06, atol=1e-06)
def test_utils_phase_type_from_erlang_5(self):
a, S = me.utils.phase_type_from_erlang(0.2, 5)
np.testing.assert_allclose(a, np.array([[1., 0., 0., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-1., 1., 0., 0., 0.],
[ 0., -1., 1., 0., 0.],
[ 0., 0., -1., 1., 0.],
[ 0., 0., 0., -1., 1.],
[ 0., 0., 0., 0., -1.]]))
def test_utils_phase_type_from_erlang_5_pdf(self):
a, S = me.utils.phase_type_from_erlang(0.2, 5)
x = np.linspace(0.0, 20.0, num=11, endpoint=True)
y = me.utils.phase_type_pdf(a, S, x)
np.testing.assert_allclose(y, np.array([0.00000000e+00, 9.02235222e-02, 1.95366815e-01, 1.33852618e-01,
5.72522885e-02, 1.89166374e-02, 5.30859947e-03, 1.33100030e-03,
3.07296050e-04, 6.66159314e-05, 1.37410241e-05]),
rtol=1e-06, atol=1e-06)
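The expected arrays in the two Erlang tests above can be sanity-checked against the closed-form Erlang density. From the `S` matrices shown, `phase_type_from_erlang(rate, shape)` appears to use a per-stage rate of `rate * shape` (total rate 0.2 with 5 stages gives stage rate 1.0). A sketch, assuming that convention:

```python
from math import exp, factorial

def erlang_pdf(x, total_rate, shape):
    """Closed-form Erlang density; per-stage rate assumed to be total_rate * shape."""
    lam = total_rate * shape
    return lam ** shape * x ** (shape - 1) * exp(-lam * x) / factorial(shape - 1)

# shape 1 reduces to the exponential density: 0.2 * exp(-0.2 * 2) ~= 0.13406401
# shape 5 at x = 2 with stage rate 1.0: 2**4 * exp(-2) / 4! ~= 0.09022352
```

Both values match the second entries of the expected arrays in the tests above (`x = np.linspace(0, 20, 11)` steps by 2).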
def test_utils_phase_type_from_parallel_erlang2_exp(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 1, 1)
np.testing.assert_allclose(a, np.array([[1.]]))
np.testing.assert_allclose(S, np.array([[-0.3]]))
def test_utils_phase_type_from_parallel_erlang2_22(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 2, 2)
np.testing.assert_allclose(a, np.array([[1., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.6, 0.4, 0.2],
[ 0. , -0.4, 0. ],
[ 0. , 0. , -0.2]]))
def test_utils_phase_type_from_parallel_erlang2_12(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 1, 2)
np.testing.assert_allclose(a, np.array([[1., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.4, 0.2],
[ 0. , -0.2]]))
def test_utils_phase_type_from_parallel_erlang2_21(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 2, 1)
np.testing.assert_allclose(a, np.array([[1., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.5, 0.4],
[ 0. , -0.4]]))
def test_utils_phase_type_from_parallel_erlang2(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 5, 8)
np.testing.assert_allclose(a, np.array([[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-1.8, 1. , 0. , 0. , 0. , 0.8, 0. , 0. , 0. , 0. , 0. , 0. ],
[ 0. , -1. , 1. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , -1. , 1. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , -1. , 1. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , -1. , 0. , 0. , 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , -0.8, 0.8, 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , 0. , -0.8, 0.8, 0. , 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.8, 0.8, 0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.8, 0.8, 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.8, 0.8, 0. ],
[ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.8, 0.8],
[ 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , 0. , -0.8]]))
def test_utils_phase_type_from_parallel_erlang2_pdf(self):
a, S = me.utils.phase_type_from_parallel_erlang2(0.2, 0.1, 5, 8)
x = np.linspace(0.0, 20.0, num=11, endpoint=True)
y = me.utils.phase_type_pdf(a, S, x)
np.testing.assert_allclose(y, np.array([0. , 0.06827431, 0.13007684, 0.10767396, 0.07956391,
0.0543326 , 0.03245024, 0.01689615, 0.00782326, 0.00329188,
0.00128175]),
rtol=1e-06, atol=1e-06)
def test_utils_phase_type_from_parallel_erlang3_exp(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 1, 1, 1)
np.testing.assert_allclose(a, np.array([[1.]]))
np.testing.assert_allclose(S, np.array([[-0.35]]))
def test_utils_phase_type_from_parallel_erlang3_222(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 2, 2, 2)
np.testing.assert_allclose(a, np.array([[1., 0., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.7, 0.4, 0.2, 0.1],
[ 0. , -0.4, 0. , 0. ],
[ 0. , 0. , -0.2, 0. ],
[ 0. , 0. , 0. , -0.1]]))
def test_utils_phase_type_from_parallel_erlang3_112(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 1, 1, 2)
np.testing.assert_allclose(a, np.array([[1., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.4, 0.1],
[ 0. , -0.1]]))
def test_utils_phase_type_from_parallel_erlang3_221(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 2, 2, 1)
np.testing.assert_allclose(a, np.array([[1., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.65, 0.4 , 0.2 ],
[ 0. , -0.4 , 0. ],
[ 0. , 0. , -0.2 ]]))
def test_utils_phase_type_from_parallel_erlang3(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 2, 4, 3)
np.testing.assert_allclose(a, np.array([[1., 0., 0., 0., 0., 0., 0.]]))
np.testing.assert_allclose(S, np.array([[-0.95, 0.4 , 0.4 , 0. , 0. , 0.15, 0. ],
[ 0. , -0.4 , 0. , 0. , 0. , 0. , 0. ],
[ 0. , 0. , -0.4 , 0.4 , 0. , 0. , 0. ],
[ 0. , 0. , 0. , -0.4 , 0.4 , 0. , 0. ],
[ 0. , 0. , 0. , 0. , -0.4 , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. , -0.15, 0.15],
[ 0. , 0. , 0. , 0. , 0. , 0. , -0.15]]))
def test_utils_phase_type_from_parallel_erlang3_pdf(self):
a, S = me.utils.phase_type_from_parallel_erlang3(0.2, 0.1, 0.05, 2, 4, 3)
x = np.linspace(0.0, 20.0, num=11, endpoint=True)
y = me.utils.phase_type_pdf(a, S, x)
np.testing.assert_allclose(y, np.array([0. , 0.10223638, 0.09316866, 0.07696751, 0.0596036 ,
0.04347545, 0.03037457, 0.02071724, 0.01403341, 0.00957331,
0.0066425 ]),
rtol=1e-06, atol=1e-06)
| 61.57971 | 121 | 0.438103 | 1,205 | 8,498 | 2.918672 | 0.102905 | 0.121126 | 0.143304 | 0.155815 | 0.816889 | 0.813193 | 0.807222 | 0.805232 | 0.714529 | 0.713961 | 0 | 0.18619 | 0.388209 | 8,498 | 137 | 122 | 62.029197 | 0.490287 | 0.022594 | 0 | 0.278261 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.243478 | 1 | 0.13913 | false | 0 | 0.026087 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
1053e3201c852bc0fce9712ba304d757daff0281 | 87,888 | py | Python | rcnn_dff/symbol/symbol_vgg.py | tonysy/mx-rcnn-flow | b78c3c964c802bb874d673170d7452e7a573a998 | [
"Apache-2.0"
] | 2 | 2018-01-31T02:47:42.000Z | 2019-07-05T03:48:54.000Z | rcnn_dff/symbol/symbol_vgg.py | tonysy/mx-rcnn-flow | b78c3c964c802bb874d673170d7452e7a573a998 | [
"Apache-2.0"
] | null | null | null | rcnn_dff/symbol/symbol_vgg.py | tonysy/mx-rcnn-flow | b78c3c964c802bb874d673170d7452e7a573a998 | [
"Apache-2.0"
] | null | null | null | import mxnet as mx
from collections import OrderedDict
import proposal
import proposal_target
import rcnn_iou_loss
import rpn_iou_loss
import sample_anchors
import sample_rois
from ..config import config
from symbol_flow import conv_unit, stereo_scale_net, feature_warp, feature_propagate, feature_propagate_share
def get_vgg_conv(data):
"""
shared convolutional layers
:param data: Symbol
:return: Symbol
"""
# group 1
conv1_1 = mx.symbol.Convolution(
data=data, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv1_1")
relu1_1 = mx.symbol.Activation(data=conv1_1, act_type="relu", name="relu1_1")
conv1_2 = mx.symbol.Convolution(
data=relu1_1, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv1_2")
relu1_2 = mx.symbol.Activation(data=conv1_2, act_type="relu", name="relu1_2")
pool1 = mx.symbol.Pooling(
data=relu1_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool1")
# group 2
conv2_1 = mx.symbol.Convolution(
data=pool1, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv2_1")
relu2_1 = mx.symbol.Activation(data=conv2_1, act_type="relu", name="relu2_1")
conv2_2 = mx.symbol.Convolution(
data=relu2_1, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv2_2")
relu2_2 = mx.symbol.Activation(data=conv2_2, act_type="relu", name="relu2_2")
pool2 = mx.symbol.Pooling(
data=relu2_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool2")
# group 3
conv3_1 = mx.symbol.Convolution(
data=pool2, kernel=(3, 3), pad=(1, 1), num_filter=256, workspace=2048, name="conv3_1")
relu3_1 = mx.symbol.Activation(data=conv3_1, act_type="relu", name="relu3_1")
conv3_2 = mx.symbol.Convolution(
data=relu3_1, kernel=(3, 3), pad=(1, 1), num_filter=256, workspace=2048, name="conv3_2")
relu3_2 = mx.symbol.Activation(data=conv3_2, act_type="relu", name="relu3_2")
conv3_3 = mx.symbol.Convolution(
data=relu3_2, kernel=(3, 3), pad=(1, 1), num_filter=256, workspace=2048, name="conv3_3")
relu3_3 = mx.symbol.Activation(data=conv3_3, act_type="relu", name="relu3_3")
pool3 = mx.symbol.Pooling(
data=relu3_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool3")
# group 4
conv4_1 = mx.symbol.Convolution(
data=pool3, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv4_1")
relu4_1 = mx.symbol.Activation(data=conv4_1, act_type="relu", name="relu4_1")
conv4_2 = mx.symbol.Convolution(
data=relu4_1, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv4_2")
relu4_2 = mx.symbol.Activation(data=conv4_2, act_type="relu", name="relu4_2")
conv4_3 = mx.symbol.Convolution(
data=relu4_2, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv4_3")
relu4_3 = mx.symbol.Activation(data=conv4_3, act_type="relu", name="relu4_3")
pool4 = mx.symbol.Pooling(
data=relu4_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool4")
# group 5
conv5_1 = mx.symbol.Convolution(
data=pool4, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv5_1")
relu5_1 = mx.symbol.Activation(data=conv5_1, act_type="relu", name="relu5_1")
conv5_2 = mx.symbol.Convolution(
data=relu5_1, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv5_2")
relu5_2 = mx.symbol.Activation(data=conv5_2, act_type="relu", name="relu5_2")
conv5_3 = mx.symbol.Convolution(
data=relu5_2, kernel=(3, 3), pad=(1, 1), num_filter=512, workspace=2048, name="conv5_3")
relu5_3 = mx.symbol.Activation(data=conv5_3, act_type="relu", name="relu5_3")
return relu5_3
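`get_vgg_conv` spells out all 13 VGG-16 conv layers by hand. The same structure could be generated from the standard VGG-16 group configuration; a plain-Python sketch (no mxnet dependency, it only emits layer descriptions, with names mirroring the ones above) of that design choice:

```python
# (number of conv layers, filter count) per VGG-16 group
VGG16_GROUPS = [(2, 64), (2, 128), (3, 256), (3, 512), (3, 512)]

def vgg16_layer_plan(pool_last_group=False):
    """List (layer_name, num_filter) pairs in forward order."""
    plan = []
    for g, (n_convs, n_filters) in enumerate(VGG16_GROUPS, start=1):
        for i in range(1, n_convs + 1):
            plan.append(("conv%d_%d" % (g, i), n_filters))
        # get_vgg_conv above omits pool5, keeping an overall stride of 16
        # for the RPN/RCNN feature maps
        if pool_last_group or g < len(VGG16_GROUPS):
            plan.append(("pool%d" % g, None))
    return plan
```

The actual network would map each `conv*` entry to an `mx.symbol.Convolution` plus ReLU and each `pool*` entry to a 2x2 max `mx.symbol.Pooling`, as in the function above.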
def get_vgg_rcnn(num_classes=config.NUM_CLASSES):
"""
Fast R-CNN with VGG 16 conv layers
:param num_classes: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
rois = mx.symbol.Variable(name='rois')
label = mx.symbol.Variable(name='label')
bbox_target = mx.symbol.Variable(name='bbox_target')
bbox_weight = mx.symbol.Variable(name='bbox_weight')
# reshape input
rois = mx.symbol.Reshape(data=rois, shape=(-1, 5), name='rois_reshape')
label = mx.symbol.Reshape(data=label, shape=(-1, ), name='label_reshape')
bbox_target = mx.symbol.Reshape(data=bbox_target, shape=(-1, 4 * num_classes), name='bbox_target_reshape')
bbox_weight = mx.symbol.Reshape(data=bbox_weight, shape=(-1, 4 * num_classes), name='bbox_weight_reshape')
# shared convolutional layers
relu5_3 = get_vgg_conv(data)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score, label=label, normalization='batch',
is_hidden_layer=config.TRAIN.RCNN_OHEM)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
if config.RCNN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight, rois=rois,
op_type='rcnn_iou_loss', name='bbox_loss_', num_classes=num_classes)
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=1.0, data=(bbox_pred - bbox_target))
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, rois=rois, bbox_pred=bbox_pred,
name='rcnn_ohem', op_type='sample_rois',
batch_images=config.TRAIN.BATCH_IMAGES, batch_size=config.TRAIN.BATCH_ROIS,
nms_threshold=config.TRAIN.RCNN_OHEM_NMS, iou_loss=config.RCNN_IOU_LOSS,
transform=config.TRAIN.RCNN_OHEM_TRANSFORM, ignore=config.TRAIN.RCNN_OHEM_IGNORE)
rcnn_group = [group[0], group[1], group[2], group[3]]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss', 'cls_mask', 'bbox_mask'],
[num_classes, num_classes * 4, num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
else:
bbox_loss = mx.sym.MakeLoss(name='bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.BATCH_ROIS)
rcnn_group = [cls_prob, bbox_loss]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss'],
[num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
# group output
group = mx.symbol.Group(rcnn_group)
return group
def get_vgg_rcnn_test(num_classes=config.NUM_CLASSES):
"""
Fast R-CNN Network with VGG
:param num_classes: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
rois = mx.symbol.Variable(name='rois')
# reshape rois
rois = mx.symbol.Reshape(data=rois, shape=(-1, 5), name='rois_reshape')
# shared convolutional layer
relu5_3 = get_vgg_conv(data)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
# reshape output
cls_prob = mx.symbol.Reshape(data=cls_prob, shape=(config.TEST.BATCH_IMAGES, -1, num_classes), name='cls_prob_reshape')
bbox_pred = mx.symbol.Reshape(data=bbox_pred, shape=(config.TEST.BATCH_IMAGES, -1, 4 * num_classes), name='bbox_pred_reshape')
# group output
group = mx.symbol.Group([cls_prob, bbox_pred])
return group
def get_vgg_rpn(num_anchors=config.NUM_ANCHORS):
"""
Region Proposal Network with VGG
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
label = mx.symbol.Variable(name='label')
bbox_target = mx.symbol.Variable(name='bbox_target')
bbox_weight = mx.symbol.Variable(name='bbox_weight')
# shared convolutional layers
relu5_3 = get_vgg_conv(data)
# RPN
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# prepare rpn data
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
# classification
cls_prob = mx.symbol.SoftmaxOutput(data=rpn_cls_score_reshape, label=label, multi_output=True,
normalization='valid', use_ignore=True, ignore_label=-1,
is_hidden_layer=config.TRAIN.RPN_OHEM, name="cls_prob")
# bounding box regression
if config.RPN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=rpn_bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight,
op_type='rpn_iou_loss', name='bbox_loss_',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES),
ratios=tuple(config.ANCHOR_RATIOS))
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=3.0, data=(rpn_bbox_pred - bbox_target))
if config.TRAIN.RPN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, bbox_pred=rpn_bbox_pred,
name='rpn_ohem', op_type='sample_anchors',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_OHEM_ANCHORS, rpn_batch_size=config.TRAIN.RPN_BATCH_SIZE,
nms_threshold=config.TRAIN.RPN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RPN_OHEM_TRANSFORM, ignore=config.TRAIN.RPN_OHEM_IGNORE, np_ratio=config.TRAIN.RPN_OHEM_NP_RATIO)
rpn_group = [group[0], group[1], group[2], group[3]]
else:
bbox_loss = mx.sym.MakeLoss(name='rpn_bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.RPN_BATCH_SIZE)
rpn_group = [cls_prob, bbox_loss]
# group output
group = mx.symbol.Group(rpn_group)
return group
def get_vgg_rpn_test(num_anchors=config.NUM_ANCHORS):
"""
Region Proposal Network with VGG
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
im_info = mx.symbol.Variable(name="im_info")
# shared convolutional layers
relu5_3 = get_vgg_conv(data)
# RPN
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# ROI Proposal
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
rpn_cls_prob = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_prob")
rpn_cls_prob_reshape = mx.symbol.Reshape(
data=rpn_cls_prob, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_prob_reshape')
if config.TEST.CXX_PROPOSAL:
group = mx.symbol.Proposal(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois', output_score=True,
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.PROPOSAL_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.PROPOSAL_POST_NMS_TOP_N,
threshold=config.TEST.PROPOSAL_NMS_THRESH, rpn_min_size=config.TEST.PROPOSAL_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
group = mx.symbol.Custom(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois', output_score=True,
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.PROPOSAL_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.PROPOSAL_POST_NMS_TOP_N,
threshold=config.TEST.PROPOSAL_NMS_THRESH, rpn_min_size=config.TEST.PROPOSAL_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# rois = group[0]
# score = group[1]
return group
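The RPN heads above predict `2 * num_anchors` scores and `4 * num_anchors` box deltas per feature-map cell, where the anchors come from `config.ANCHOR_SCALES` and `config.ANCHOR_RATIOS` on the `RPN_FEAT_STRIDE` grid. A simplified per-cell anchor-shape sketch following the common Faster R-CNN recipe (widths/heights only; the rounding details of the original `generate_anchors` are omitted, and `ratio` is taken as height/width):

```python
def anchor_shapes(base_size=16, ratios=(0.5, 1, 2), scales=(8, 16, 32)):
    """(w, h) per anchor: area (base_size * scale)**2, aspect ratio h/w = ratio."""
    shapes = []
    for ratio in ratios:
        # Keep the base area fixed while changing the aspect ratio
        w = base_size / ratio ** 0.5
        h = base_size * ratio ** 0.5
        for scale in scales:
            shapes.append((w * scale, h * scale))
    return shapes
```

With the defaults this yields the familiar 9 anchors per cell (3 ratios x 3 scales), e.g. a 128x128 square for ratio 1, scale 8.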
def get_vgg_test(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
Faster R-CNN test with VGG 16 conv layers
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
im_info = mx.symbol.Variable(name="im_info")
# shared convolutional layers
relu5_3 = get_vgg_conv(data)
# RPN
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# ROI Proposal
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
rpn_cls_prob = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_prob")
rpn_cls_prob_reshape = mx.symbol.Reshape(
data=rpn_cls_prob, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_prob_reshape')
if config.TEST.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
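ROIPooling projects each RoI into feature-map coordinates via spatial_scale and max-pools it into a fixed 7x7 grid, so the fully connected head sees a constant-size input regardless of RoI size. A deliberately simplified single-channel NumPy sketch (the real operator uses a slightly different rounding scheme):

```python
import numpy as np

def roi_max_pool(feat, roi, out_size, spatial_scale):
    """Crudely max-pool one RoI (x1, y1, x2, y2 in image coords) to out_size bins."""
    x1, y1, x2, y2 = [int(round(c * spatial_scale)) for c in roi]
    crop = feat[y1:y2 + 1, x1:x2 + 1]
    ph, pw = out_size
    ys = np.linspace(0, crop.shape[0], ph + 1).astype(int)
    xs = np.linspace(0, crop.shape[1], pw + 1).astype(int)
    out = np.zeros((ph, pw))
    for i in range(ph):
        for j in range(pw):
            out[i, j] = crop[ys[i]:max(ys[i + 1], ys[i] + 1),
                             xs[j]:max(xs[j + 1], xs[j] + 1)].max()
    return out

# A whole-image RoI on an 8x8 feature map, pooled to 2x2.
feat = np.arange(64, dtype=float).reshape(8, 8)
pooled = roi_max_pool(feat, (0, 0, 127, 127), out_size=(2, 2), spatial_scale=1.0 / 16)
```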
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
# reshape output
cls_prob = mx.symbol.Reshape(data=cls_prob, shape=(config.TEST.BATCH_IMAGES, -1, num_classes), name='cls_prob_reshape')
bbox_pred = mx.symbol.Reshape(data=bbox_pred, shape=(config.TEST.BATCH_IMAGES, -1, 4 * num_classes), name='bbox_pred_reshape')
# group output
group = mx.symbol.Group([rois, cls_prob, bbox_pred])
return group
def get_vgg_train(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
Faster R-CNN end-to-end with VGG 16 conv layers
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
im_info = mx.symbol.Variable(name="im_info")
gt_boxes = mx.symbol.Variable(name="gt_boxes")
rpn_label = mx.symbol.Variable(name='label')
rpn_bbox_target = mx.symbol.Variable(name='bbox_target')
rpn_bbox_weight = mx.symbol.Variable(name='bbox_weight')
# shared convolutional layers
relu5_3 = get_vgg_conv(data)
# RPN layers
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# prepare rpn data
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
# classification
rpn_cls_prob = mx.symbol.SoftmaxOutput(data=rpn_cls_score_reshape, label=rpn_label, multi_output=True,
normalization='valid', use_ignore=True, ignore_label=-1,
is_hidden_layer=config.TRAIN.RPN_OHEM, name="rpn_cls_prob")
# bounding box regression
if config.RPN_IOU_LOSS:
rpn_bbox_loss_ = mx.symbol.Custom(data=rpn_bbox_pred, bbox_target=rpn_bbox_target, bbox_weight=rpn_bbox_weight,
op_type='rpn_iou_loss', name='rpn_bbox_loss_',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES),
ratios=tuple(config.ANCHOR_RATIOS))
else:
rpn_bbox_loss_ = rpn_bbox_weight * mx.symbol.smooth_l1(name='rpn_bbox_loss_', scalar=3.0, data=(rpn_bbox_pred - rpn_bbox_target))
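The scalar argument of smooth_l1 is the sigma of the Fast R-CNN loss: the value is 0.5*(sigma*x)^2 for |x| < 1/sigma^2 and |x| - 0.5/sigma^2 otherwise, so scalar=3.0 narrows the quadratic region to |x| < 1/9. A NumPy sketch of that piecewise form:

```python
import numpy as np

def smooth_l1(x, sigma):
    """MXNet-style smooth L1: quadratic inside |x| < 1/sigma^2, linear outside."""
    point = 1.0 / sigma ** 2
    return np.where(np.abs(x) < point,
                    0.5 * (sigma * x) ** 2,
                    np.abs(x) - 0.5 / sigma ** 2)

x = np.array([-1.0, -0.05, 0.0, 0.05, 1.0])
loss = smooth_l1(x, sigma=3.0)
```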
# rpn output
if config.TRAIN.RPN_OHEM:
group = mx.symbol.Custom(
cls_prob=rpn_cls_prob, bbox_loss=rpn_bbox_loss_, label=rpn_label, bbox_pred=rpn_bbox_pred,
name='rpn_ohem', op_type='sample_anchors',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_OHEM_ANCHORS, rpn_batch_size=config.TRAIN.RPN_BATCH_SIZE,
nms_threshold=config.TRAIN.RPN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RPN_OHEM_TRANSFORM, ignore=config.TRAIN.RPN_OHEM_IGNORE, np_ratio=config.TRAIN.RPN_OHEM_NP_RATIO)
rpn_group = [group[0], group[1], group[2], group[3]]
else:
rpn_bbox_loss = mx.sym.MakeLoss(name='rpn_bbox_loss', data=rpn_bbox_loss_, grad_scale=1.0 / config.TRAIN.RPN_BATCH_SIZE)
rpn_group = [rpn_cls_prob, rpn_bbox_loss]
# ROI proposal
rpn_cls_act = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_act")
rpn_cls_act_reshape = mx.symbol.Reshape(
data=rpn_cls_act, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_act_reshape')
if config.TRAIN.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# ROI proposal target
gt_boxes_reshape = mx.symbol.Reshape(data=gt_boxes, shape=(-1, 5), name='gt_boxes_reshape')
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.RCNN_OHEM_ROIS, ohem=config.TRAIN.RCNN_OHEM)
else:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.BATCH_ROIS, fg_fraction=config.TRAIN.FG_FRACTION)
rois = group[0]
label = group[1]
bbox_target = group[2]
bbox_weight = group[3]
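The proposal_target custom op samples RoIs, assigns class labels, and encodes each matched ground-truth box against its RoI in the usual Fast R-CNN (dx, dy, dw, dh) parameterization, which is the quantity bbox_pred learns to regress. A sketch of that encoding (helper name hypothetical):

```python
import numpy as np

def bbox_transform(roi, gt):
    """Encode gt relative to roi as (dx, dy, dw, dh); boxes are (x1, y1, x2, y2)."""
    rw, rh = roi[2] - roi[0] + 1.0, roi[3] - roi[1] + 1.0
    rcx, rcy = roi[0] + 0.5 * (rw - 1), roi[1] + 0.5 * (rh - 1)
    gw, gh = gt[2] - gt[0] + 1.0, gt[3] - gt[1] + 1.0
    gcx, gcy = gt[0] + 0.5 * (gw - 1), gt[1] + 0.5 * (gh - 1)
    # Center offsets are normalized by RoI size; scales are log-ratios.
    return np.array([(gcx - rcx) / rw, (gcy - rcy) / rh,
                     np.log(gw / rw), np.log(gh / rh)])

# An RoI that already matches its ground truth encodes to all zeros.
target = bbox_transform(np.array([10.0, 10, 30, 50]), np.array([10.0, 10, 30, 50]))
```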
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score, label=label, normalization='batch',
is_hidden_layer=config.TRAIN.RCNN_OHEM)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
if config.RCNN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight, rois=rois,
op_type='rcnn_iou_loss', name='bbox_loss_', num_classes=num_classes)
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=1.0, data=(bbox_pred - bbox_target))
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, rois=rois, bbox_pred=bbox_pred,
name='rcnn_ohem', op_type='sample_rois',
batch_images=config.TRAIN.BATCH_IMAGES, batch_size=config.TRAIN.BATCH_ROIS,
nms_threshold=config.TRAIN.RCNN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RCNN_OHEM_TRANSFORM, ignore=config.TRAIN.RCNN_OHEM_IGNORE)
rcnn_group = [group[0], group[1], group[2], group[3]]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss', 'cls_mask', 'bbox_mask'],
[num_classes, num_classes * 4, num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
else:
bbox_loss = mx.sym.MakeLoss(name='bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.BATCH_ROIS)
rcnn_group = [cls_prob, bbox_loss]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss'],
[num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
# append label
label = mx.symbol.Reshape(data=label, shape=(config.TRAIN.BATCH_IMAGES, -1), name='label_reshape')
rcnn_group += [mx.symbol.BlockGrad(label, name='label_blockgrad')]
group = mx.symbol.Group(rpn_group + rcnn_group)
return group
def get_vgg_acc_test(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
Faster R-CNN test symbol with a reduced-width VGG-style conv body (32-128 filters, 512-d fc layers)
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
im_info = mx.symbol.Variable(name="im_info")
# group 1
conv1_1 = mx.symbol.Convolution(
data=data, kernel=(3, 3), pad=(1, 1), num_filter=32, workspace=2048, name="conv1_1")
relu1_1 = mx.symbol.Activation(data=conv1_1, act_type="relu", name="relu1_1")
conv1_2 = mx.symbol.Convolution(
data=relu1_1, kernel=(3, 3), pad=(1, 1), num_filter=32, workspace=2048, name="conv1_2")
relu1_2 = mx.symbol.Activation(data=conv1_2, act_type="relu", name="relu1_2")
pool1 = mx.symbol.Pooling(
data=relu1_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool1")
# group 2
conv2_1 = mx.symbol.Convolution(
data=pool1, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv2_1")
relu2_1 = mx.symbol.Activation(data=conv2_1, act_type="relu", name="relu2_1")
conv2_2 = mx.symbol.Convolution(
data=relu2_1, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv2_2")
relu2_2 = mx.symbol.Activation(data=conv2_2, act_type="relu", name="relu2_2")
pool2 = mx.symbol.Pooling(
data=relu2_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool2")
# group 3
conv3_1 = mx.symbol.Convolution(
data=pool2, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv3_1")
relu3_1 = mx.symbol.Activation(data=conv3_1, act_type="relu", name="relu3_1")
conv3_2 = mx.symbol.Convolution(
data=relu3_1, kernel=(3, 3), pad=(1, 1), num_filter=64, workspace=2048, name="conv3_2")
relu3_2 = mx.symbol.Activation(data=conv3_2, act_type="relu", name="relu3_2")
conv3_3 = mx.symbol.Convolution(
data=relu3_2, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv3_3")
relu3_3 = mx.symbol.Activation(data=conv3_3, act_type="relu", name="relu3_3")
pool3 = mx.symbol.Pooling(
data=relu3_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool3")
# group 4
conv4_1 = mx.symbol.Convolution(
data=pool3, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv4_1")
relu4_1 = mx.symbol.Activation(data=conv4_1, act_type="relu", name="relu4_1")
conv4_2 = mx.symbol.Convolution(
data=relu4_1, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv4_2")
relu4_2 = mx.symbol.Activation(data=conv4_2, act_type="relu", name="relu4_2")
conv4_3 = mx.symbol.Convolution(
data=relu4_2, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv4_3")
relu4_3 = mx.symbol.Activation(data=conv4_3, act_type="relu", name="relu4_3")
pool4 = mx.symbol.Pooling(
data=relu4_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool4")
# group 5
conv5_1 = mx.symbol.Convolution(
data=pool4, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv5_1")
relu5_1 = mx.symbol.Activation(data=conv5_1, act_type="relu", name="relu5_1")
conv5_2 = mx.symbol.Convolution(
data=relu5_1, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv5_2")
relu5_2 = mx.symbol.Activation(data=conv5_2, act_type="relu", name="relu5_2")
conv5_3 = mx.symbol.Convolution(
data=relu5_2, kernel=(3, 3), pad=(1, 1), num_filter=128, workspace=2048, name="conv5_3")
relu5_3 = mx.symbol.Activation(data=conv5_3, act_type="relu", name="relu5_3")
# shared convolutional layers
conv_feat = relu5_3
# RPN
rpn_conv = mx.symbol.Convolution(
data=conv_feat, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# ROI Proposal
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
rpn_cls_prob = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_prob")
rpn_cls_prob_reshape = mx.symbol.Reshape(
data=rpn_cls_prob, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_prob_reshape')
if config.TEST.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=conv_feat, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=conv_feat, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=512, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=512, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
# reshape output
cls_prob = mx.symbol.Reshape(data=cls_prob, shape=(config.TEST.BATCH_IMAGES, -1, num_classes), name='cls_prob_reshape')
bbox_pred = mx.symbol.Reshape(data=bbox_pred, shape=(config.TEST.BATCH_IMAGES, -1, 4 * num_classes), name='bbox_pred_reshape')
# group output
group = mx.symbol.Group([rois, cls_prob, bbox_pred])
return group
def get_vgg_dilate_conv(data):
"""
VGG-16 shared convolutional layers; group 5 uses dilated convolutions
:param data: Symbol
:return: Symbol
"""
# group 1
conv1_1 = mx.symbol.Convolution(
data=data, kernel=(3, 3), pad=(1, 1), num_filter=64, name="conv1_1")
relu1_1 = mx.symbol.Activation(data=conv1_1, act_type="relu", name="relu1_1")
conv1_2 = mx.symbol.Convolution(
data=relu1_1, kernel=(3, 3), pad=(1, 1), num_filter=64, name="conv1_2")
relu1_2 = mx.symbol.Activation(data=conv1_2, act_type="relu", name="relu1_2")
pool1 = mx.symbol.Pooling(
data=relu1_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool1")
# group 2
conv2_1 = mx.symbol.Convolution(
data=pool1, kernel=(3, 3), pad=(1, 1), num_filter=128, name="conv2_1")
relu2_1 = mx.symbol.Activation(data=conv2_1, act_type="relu", name="relu2_1")
conv2_2 = mx.symbol.Convolution(
data=relu2_1, kernel=(3, 3), pad=(1, 1), num_filter=128, name="conv2_2")
relu2_2 = mx.symbol.Activation(data=conv2_2, act_type="relu", name="relu2_2")
pool2 = mx.symbol.Pooling(
data=relu2_2, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool2")
# group 3
conv3_1 = mx.symbol.Convolution(
data=pool2, kernel=(3, 3), pad=(1, 1), num_filter=256, name="conv3_1")
relu3_1 = mx.symbol.Activation(data=conv3_1, act_type="relu", name="relu3_1")
conv3_2 = mx.symbol.Convolution(
data=relu3_1, kernel=(3, 3), pad=(1, 1), num_filter=256, name="conv3_2")
relu3_2 = mx.symbol.Activation(data=conv3_2, act_type="relu", name="relu3_2")
conv3_3 = mx.symbol.Convolution(
data=relu3_2, kernel=(3, 3), pad=(1, 1), num_filter=256, name="conv3_3")
relu3_3 = mx.symbol.Activation(data=conv3_3, act_type="relu", name="relu3_3")
pool3 = mx.symbol.Pooling(
data=relu3_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool3")
# group 4
conv4_1 = mx.symbol.Convolution(
data=pool3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="conv4_1")
relu4_1 = mx.symbol.Activation(data=conv4_1, act_type="relu", name="relu4_1")
conv4_2 = mx.symbol.Convolution(
data=relu4_1, kernel=(3, 3), pad=(1, 1), num_filter=512, name="conv4_2")
relu4_2 = mx.symbol.Activation(data=conv4_2, act_type="relu", name="relu4_2")
conv4_3 = mx.symbol.Convolution(
data=relu4_2, kernel=(3, 3), pad=(1, 1), num_filter=512, name="conv4_3")
relu4_3 = mx.symbol.Activation(data=conv4_3, act_type="relu", name="relu4_3")
pool4 = mx.symbol.Pooling(
data=relu4_3, pool_type="max", kernel=(2, 2), stride=(2, 2), name="pool4")
# group 5: dilated convolutions enlarge the receptive field at the same stride
conv5_1 = mx.symbol.Convolution(
data=pool4, kernel=(3, 3), pad=(2, 2), dilate=(2, 2), num_filter=512, name="conv5_1")
relu5_1 = mx.symbol.Activation(data=conv5_1, act_type="relu", name="relu5_1")
conv5_2 = mx.symbol.Convolution(
data=relu5_1, kernel=(3, 3), pad=(2, 2), dilate=(2, 2), num_filter=512, name="conv5_2")
relu5_2 = mx.symbol.Activation(data=conv5_2, act_type="relu", name="relu5_2")
conv5_3 = mx.symbol.Convolution(
data=relu5_2, kernel=(3, 3), pad=(2, 2), dilate=(2, 2), num_filter=512, name="conv5_3")
relu5_3 = mx.symbol.Activation(data=conv5_3, act_type="relu", name="relu5_3")
return relu5_3
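With kernel 3, dilation 2, and pad 2 the effective kernel extent is k + (k-1)(d-1) = 5, so each group-5 convolution preserves the spatial size while enlarging the receptive field. The standard output-size arithmetic, as a quick check:

```python
def conv_out(size, kernel, pad, stride=1, dilate=1):
    """Output length of a convolution along one spatial axis."""
    eff = kernel + (kernel - 1) * (dilate - 1)  # effective (dilated) kernel extent
    return (size + 2 * pad - eff) // stride + 1

# Plain 3x3 with pad 1 keeps the size; dilated 3x3 needs pad 2 to do the same.
same_plain = conv_out(38, kernel=3, pad=1)
same_dilated = conv_out(38, kernel=3, pad=2, dilate=2)
```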
def get_vgg_train_dff(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
Faster R-CNN end-to-end with VGG-16 conv layers
Adapted for deep feature flow: conv features are warped between frames with a FlowNet
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
data2 = mx.symbol.Variable(name="data2")
im_info = mx.symbol.Variable(name="im_info")
gt_boxes = mx.symbol.Variable(name="gt_boxes")
rpn_label = mx.symbol.Variable(name='label')
rpn_bbox_target = mx.symbol.Variable(name='bbox_target')
rpn_bbox_weight = mx.symbol.Variable(name='bbox_weight')
# shared convolutional layers
# relu5_3 = get_vgg_conv(data)
relu5_3 = get_vgg_dilate_conv(data2)
relu5_3, _, _ = feature_propagate(relu5_3, data, data2)
# flownet = stereo_scale_net(data*config.FLOW_SCALE_FACTOR, \
# data2*config.FLOW_SCALE_FACTOR,\
# net_type='flow')
# flow = flownet[0]
# scale = flownet[1]
# scale_avg = mx.sym.Pooling(data=scale*0.125, pool_type='avg',\
# kernel=(8,8),stride=(8,8),name="scale_avg")
# flow_avg = mx.sym.Pooling(data=flow*0.125, pool_type='avg',\
# kernel=(8,8),stride=(8,8),name="flow_avg")
#
# flow_grid = mx.symbol.GridGenerator(data=flow_avg,transform_type='warp',\
# name='flow_grid')
# warp_res = mx.symbol.BilinearSampler(data=relu5_3,grid=flow_grid,\
# name='warp_res')
#
# relu5_3 = warp_res * scale_avg
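The commented-out block above documents what feature_propagate is assumed to do internally: average-pool the flow down to the feature stride, build a warp grid from it, bilinearly sample the reference features, and rescale them. A toy nearest-neighbour version of warping a feature map by a flow field (NumPy, uniform integer flow for simplicity):

```python
import numpy as np

def warp_nearest(feat, flow):
    """Warp feat (H, W) by flow (2, H, W); flow gives (dx, dy) source offsets."""
    h, w = feat.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Each output pixel reads from (x + dx, y + dy), clipped to the map.
    src_x = np.clip(xs + flow[0], 0, w - 1).astype(int)
    src_y = np.clip(ys + flow[1], 0, h - 1).astype(int)
    return feat[src_y, src_x]

feat = np.arange(16, dtype=float).reshape(4, 4)
# A uniform flow of (+1, 0): every output pixel reads its right-hand neighbour.
flow = np.stack([np.ones((4, 4)), np.zeros((4, 4))])
warped = warp_nearest(feat, flow)
```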
# RPN layers
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# prepare rpn data
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
# classification
rpn_cls_prob = mx.symbol.SoftmaxOutput(data=rpn_cls_score_reshape, label=rpn_label, multi_output=True,
normalization='valid', use_ignore=True, ignore_label=-1,
is_hidden_layer=config.TRAIN.RPN_OHEM, name="rpn_cls_prob")
# bounding box regression
if config.RPN_IOU_LOSS:
rpn_bbox_loss_ = mx.symbol.Custom(data=rpn_bbox_pred, bbox_target=rpn_bbox_target, bbox_weight=rpn_bbox_weight,
op_type='rpn_iou_loss', name='rpn_bbox_loss_',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES),
ratios=tuple(config.ANCHOR_RATIOS))
else:
rpn_bbox_loss_ = rpn_bbox_weight * mx.symbol.smooth_l1(name='rpn_bbox_loss_', scalar=3.0, data=(rpn_bbox_pred - rpn_bbox_target))
# rpn output
if config.TRAIN.RPN_OHEM:
group = mx.symbol.Custom(
cls_prob=rpn_cls_prob, bbox_loss=rpn_bbox_loss_, label=rpn_label, bbox_pred=rpn_bbox_pred,
name='rpn_ohem', op_type='sample_anchors',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_OHEM_ANCHORS, rpn_batch_size=config.TRAIN.RPN_BATCH_SIZE,
nms_threshold=config.TRAIN.RPN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RPN_OHEM_TRANSFORM, ignore=config.TRAIN.RPN_OHEM_IGNORE, np_ratio=config.TRAIN.RPN_OHEM_NP_RATIO)
rpn_group = [group[0], group[1], group[2], group[3]]
else:
rpn_bbox_loss = mx.sym.MakeLoss(name='rpn_bbox_loss', data=rpn_bbox_loss_, grad_scale=1.0 / config.TRAIN.RPN_BATCH_SIZE)
rpn_group = [rpn_cls_prob, rpn_bbox_loss]
# ROI proposal
rpn_cls_act = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_act")
rpn_cls_act_reshape = mx.symbol.Reshape(
data=rpn_cls_act, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_act_reshape')
if config.TRAIN.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# ROI proposal target
gt_boxes_reshape = mx.symbol.Reshape(data=gt_boxes, shape=(-1, 5), name='gt_boxes_reshape')
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.RCNN_OHEM_ROIS, ohem=config.TRAIN.RCNN_OHEM)
else:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.BATCH_ROIS, fg_fraction=config.TRAIN.FG_FRACTION)
rois = group[0]
label = group[1]
bbox_target = group[2]
bbox_weight = group[3]
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score, label=label, normalization='batch',
is_hidden_layer=config.TRAIN.RCNN_OHEM)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
if config.RCNN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight, rois=rois,
op_type='rcnn_iou_loss', name='bbox_loss_', num_classes=num_classes)
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=1.0, data=(bbox_pred - bbox_target))
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, rois=rois, bbox_pred=bbox_pred,
name='rcnn_ohem', op_type='sample_rois',
batch_images=config.TRAIN.BATCH_IMAGES, batch_size=config.TRAIN.BATCH_ROIS,
nms_threshold=config.TRAIN.RCNN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RCNN_OHEM_TRANSFORM, ignore=config.TRAIN.RCNN_OHEM_IGNORE)
rcnn_group = [group[0], group[1], group[2], group[3]]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss', 'cls_mask', 'bbox_mask'],
[num_classes, num_classes * 4, num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
else:
bbox_loss = mx.sym.MakeLoss(name='bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.BATCH_ROIS)
rcnn_group = [cls_prob, bbox_loss]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss'],
[num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
# append label
label = mx.symbol.Reshape(data=label, shape=(config.TRAIN.BATCH_IMAGES, -1), name='label_reshape')
rcnn_group += [mx.symbol.BlockGrad(label, name='label_blockgrad')]
group = mx.symbol.Group(rpn_group + rcnn_group)
return group
def get_vgg_test_dff(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
Faster R-CNN test with VGG-16 conv layers, adapted for deep feature flow
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
data2 = mx.symbol.Variable(name="data2")
im_info = mx.symbol.Variable(name="im_info")
# shared convolutional layers
# relu5_3 = get_vgg_conv(data)
relu5_3 = get_vgg_dilate_conv(data2)
relu5_3, flow, flow_avg = feature_propagate(relu5_3, data, data2)
# flownet = stereo_scale_net(data*config.FLOW_SCALE_FACTOR, \
# data2*config.FLOW_SCALE_FACTOR,\
# net_type='flow')
# flow = flownet[0]
# scale = flownet[1]
# scale_avg = mx.sym.Pooling(data=scale*0.125, pool_type='avg',\
# kernel=(8,8),stride=(8,8),name="scale_avg")
# flow_avg = mx.sym.Pooling(data=flow*0.125, pool_type='avg',\
# kernel=(8,8),stride=(8,8),name="flow_avg")
#
# flow_grid = mx.symbol.GridGenerator(data=flow_avg,transform_type='warp',\
# name='flow_grid')
# warp_res = mx.symbol.BilinearSampler(data=relu5_3,grid=flow_grid,\
# name='warp_res')
#
# relu5_3 = warp_res * scale_avg
# RPN
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# ROI Proposal
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
rpn_cls_prob = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_prob")
rpn_cls_prob_reshape = mx.symbol.Reshape(
data=rpn_cls_prob, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_prob_reshape')
if config.TEST.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
# reshape output
cls_prob = mx.symbol.Reshape(data=cls_prob, shape=(config.TEST.BATCH_IMAGES, -1, num_classes), name='cls_prob_reshape')
bbox_pred = mx.symbol.Reshape(data=bbox_pred, shape=(config.TEST.BATCH_IMAGES, -1, 4 * num_classes), name='bbox_pred_reshape')
# group output
# group = mx.symbol.Group([rois, cls_prob, bbox_pred])
group = mx.symbol.Group([rois, cls_prob, bbox_pred, flow, flow_avg])
return group
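The test symbol above emits bbox_pred with num_classes * 4 values per RoI, i.e. one 4-vector of regression deltas per class; downstream decoding picks out the 4 values belonging to the predicted class. A minimal pure-Python sketch of that layout (hypothetical values, independent of MXNet):

```python
def class_deltas(bbox_pred_row, cls_index):
    # One flat row of num_classes * 4 deltas; class k owns indices 4k..4k+3,
    # matching the num_hidden=num_classes * 4 layout of the bbox_pred layer.
    start = cls_index * 4
    return bbox_pred_row[start:start + 4]

# Two classes -> 8 values per RoI; class 1 owns indices 4..7.
row = [0.0, 0.1, 0.2, 0.3, 1.0, 1.1, 1.2, 1.3]
deltas = class_deltas(row, 1)  # [1.0, 1.1, 1.2, 1.3]
```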
def get_vgg_train_dff_cycle(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
    Faster R-CNN end-to-end with VGG-16 conv layers, adapted for deep
    feature flow: features are warped with a FlowNet and cycled for consistency
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
data = mx.symbol.Variable(name="data")
data2 = mx.symbol.Variable(name="data2")
im_info = mx.symbol.Variable(name="im_info")
gt_boxes = mx.symbol.Variable(name="gt_boxes")
rpn_label = mx.symbol.Variable(name='label')
rpn_bbox_target = mx.symbol.Variable(name='bbox_target')
rpn_bbox_weight = mx.symbol.Variable(name='bbox_weight')
param_variable_list = [mx.sym.Variable(item) for item in config.SHARE_PARAMS_LIST]
param_dic = dict(zip(config.SHARE_PARAMS_LIST, param_variable_list))
    print 'shared flow params:', param_dic
# shared convolutional layers
# relu5_3 = get_vgg_conv(data)
# relu5_3 = get_vgg_dilate_conv(data2)
# relu5_3, _, _ = feature_propagate(relu5_3, data, data2)
    # Cycle rcnn_dff:
    #   src_feature  = conv(Curr_Image)
    #   temp_feature = src_feature warped towards Prev_Image
    #   dst_feature  = temp_feature warped back towards Curr_Image
    #
    #   Prev_Image ------> Curr_Image      Curr_Image ------> Prev_Image
    #                          |                                   |
    #                          v                                   v
    #   Prev_Feature <--- Curr_Feature     Curr_Feature <--- Prev_Feature
src_feature = get_vgg_dilate_conv(data)
temp_feature, _, _ = feature_propagate_share(param_dic, src_feature, data2, data)
dst_feature, _, _ = feature_propagate_share(param_dic, temp_feature, data, data2)
relu5_3 = dst_feature
# RPN layers
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# prepare rpn data
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
# classification
rpn_cls_prob = mx.symbol.SoftmaxOutput(data=rpn_cls_score_reshape, label=rpn_label, multi_output=True,
normalization='valid', use_ignore=True, ignore_label=-1,
is_hidden_layer=config.TRAIN.RPN_OHEM, name="rpn_cls_prob")
# bounding box regression
if config.RPN_IOU_LOSS:
rpn_bbox_loss_ = mx.symbol.Custom(data=rpn_bbox_pred, bbox_target=rpn_bbox_target, bbox_weight=rpn_bbox_weight,
op_type='rpn_iou_loss', name='rpn_bbox_loss_',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES),
ratios=tuple(config.ANCHOR_RATIOS))
else:
rpn_bbox_loss_ = rpn_bbox_weight * mx.symbol.smooth_l1(name='rpn_bbox_loss_', scalar=3.0, data=(rpn_bbox_pred - rpn_bbox_target))
# rpn output
if config.TRAIN.RPN_OHEM:
group = mx.symbol.Custom(
cls_prob=rpn_cls_prob, bbox_loss=rpn_bbox_loss_, label=rpn_label, bbox_pred=rpn_bbox_pred,
name='rpn_ohem', op_type='sample_anchors',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_OHEM_ANCHORS, rpn_batch_size=config.TRAIN.RPN_BATCH_SIZE,
nms_threshold=config.TRAIN.RPN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RPN_OHEM_TRANSFORM, ignore=config.TRAIN.RPN_OHEM_IGNORE, np_ratio=config.TRAIN.RPN_OHEM_NP_RATIO)
rpn_group = [group[0], group[1], group[2], group[3]]
else:
rpn_bbox_loss = mx.sym.MakeLoss(name='rpn_bbox_loss', data=rpn_bbox_loss_, grad_scale=1.0 / config.TRAIN.RPN_BATCH_SIZE)
rpn_group = [rpn_cls_prob, rpn_bbox_loss]
# ROI proposal
rpn_cls_act = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_act")
rpn_cls_act_reshape = mx.symbol.Reshape(
data=rpn_cls_act, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_act_reshape')
if config.TRAIN.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# ROI proposal target
gt_boxes_reshape = mx.symbol.Reshape(data=gt_boxes, shape=(-1, 5), name='gt_boxes_reshape')
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.RCNN_OHEM_ROIS, ohem=config.TRAIN.RCNN_OHEM)
else:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.BATCH_ROIS, fg_fraction=config.TRAIN.FG_FRACTION)
rois = group[0]
label = group[1]
bbox_target = group[2]
bbox_weight = group[3]
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score, label=label, normalization='batch',
is_hidden_layer=config.TRAIN.RCNN_OHEM)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
if config.RCNN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight, rois=rois,
op_type='rcnn_iou_loss', name='bbox_loss_', num_classes=num_classes)
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=1.0, data=(bbox_pred - bbox_target))
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, rois=rois, bbox_pred=bbox_pred,
name='rcnn_ohem', op_type='sample_rois',
batch_images=config.TRAIN.BATCH_IMAGES, batch_size=config.TRAIN.BATCH_ROIS,
nms_threshold=config.TRAIN.RCNN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RCNN_OHEM_TRANSFORM, ignore=config.TRAIN.RCNN_OHEM_IGNORE)
rcnn_group = [group[0], group[1], group[2], group[3]]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss', 'cls_mask', 'bbox_mask'],
[num_classes, num_classes * 4, num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
else:
bbox_loss = mx.sym.MakeLoss(name='bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.BATCH_ROIS)
rcnn_group = [cls_prob, bbox_loss]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss'],
[num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
# append label
label = mx.symbol.Reshape(data=label, shape=(config.TRAIN.BATCH_IMAGES, -1), name='label_reshape')
rcnn_group += [mx.symbol.BlockGrad(label, name='label_blockgrad')]
group = mx.symbol.Group(rpn_group + rcnn_group)
return group
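The cycle built above (warp the current feature to the previous frame, then warp it back) can be illustrated with a toy 1-D warp where the "flow" is just an integer shift: warping by +s and then by -s recovers the signal away from the padded border. This is only a sketch of the cycle idea, not the bilinear FlowNet warp performed by feature_propagate_share:

```python
def shift_warp(feat, shift):
    # Toy stand-in for flow warping: translate a 1-D feature by an
    # integer offset, padding the exposed border with zeros.
    n = len(feat)
    out = [0.0] * n
    for i in range(n):
        j = i - shift
        if 0 <= j < n:
            out[i] = feat[j]
    return out

feat = [1.0, 2.0, 3.0, 4.0, 5.0]
cycled = shift_warp(shift_warp(feat, 2), -2)
# Interior positions survive the round trip; borders were zero-padded.
assert cycled == [1.0, 2.0, 3.0, 0.0, 0.0]
```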
def get_embedding_feature(feature_input, embedding_param_dic):
    """
    Embedding sub-network for similarity measurement: 1x1 -> 3x3 -> 1x1
    convolutions (all size-preserving), flattened to one vector per image.
    """
    embedding_conv1 = mx.symbol.Convolution(data=feature_input, kernel=(1, 1),
                                            pad=(0, 0), num_filter=512, name="embedding_conv1",
                                            weight=embedding_param_dic['embedding_conv1_weight'],
                                            bias=embedding_param_dic['embedding_conv1_bias'])
    # kernel must be 3x3 to match pad=(1, 1); a 1x1 kernel with this padding
    # would grow the feature map and break the elementwise aggregation later
    embedding_conv2 = mx.symbol.Convolution(data=embedding_conv1, kernel=(3, 3),
                                            pad=(1, 1), num_filter=512, name="embedding_conv2",
                                            weight=embedding_param_dic['embedding_conv2_weight'],
                                            bias=embedding_param_dic['embedding_conv2_bias'])
    embedding_conv3 = mx.symbol.Convolution(data=embedding_conv2, kernel=(1, 1),
                                            pad=(0, 0), num_filter=512, name="embedding_conv3",
                                            weight=embedding_param_dic['embedding_conv3_weight'],
                                            bias=embedding_param_dic['embedding_conv3_bias'])
    embedding_flatten = mx.symbol.Flatten(data=embedding_conv3, name='embedding_flatten')
    return embedding_flatten
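The flattened embeddings of different frames are later combined elementwise, so every embedding convolution must preserve the spatial size; with the standard output-size formula this holds for a 1x1 kernel with pad 0 or a 3x3 kernel with pad 1, but not for a 1x1 kernel with pad 1. A quick pure-Python check (stride 1 assumed, as in the symbols above):

```python
def conv_out(size, kernel, pad, stride=1):
    # Standard convolution output size: floor((n + 2p - k) / s) + 1.
    return (size + 2 * pad - kernel) // stride + 1

assert conv_out(48, 1, 0) == 48   # 1x1, pad 0: size preserved
assert conv_out(48, 3, 1) == 48   # 3x3, pad 1: size preserved
assert conv_out(48, 1, 1) == 50   # 1x1, pad 1: grows by 2 -> shape mismatch
```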
def get_feature_aggregation(relu5_3_dict):
    """
    Flow-guided feature aggregation: weight each warped feature by the
    similarity of its embedding to the reference frame's embedding.
    :param relu5_3_dict: reference feature plus nearby features warped by flow
    :return: Symbol
    """
    embedding_param_variable_list = [mx.sym.Variable(item) for item in config.EMBEDDING_PARAMS_LIST]
    embedding_param_dic = dict(zip(config.EMBEDDING_PARAMS_LIST, embedding_param_variable_list))
    embedding_dict = {}
    for item in relu5_3_dict.keys():
        embedding_dict[item] = get_embedding_feature(relu5_3_dict[item], embedding_param_dic)
    ref_feature = embedding_dict['data']
    # L2 norm of ref_feature (name = 'data')
    ref_length_sqr = mx.symbol.dot(ref_feature, mx.symbol.Reshape(ref_feature, shape=(-1,)))
    ref_length = mx.symbol.sqrt(ref_length_sqr)
score_dict = {}
dot_product_dict = {}
l2_product_dict = {}
    for item in embedding_dict.keys():
        curr_feature = embedding_dict[item]
        dot_product = mx.symbol.dot(curr_feature, mx.symbol.Reshape(ref_feature, shape=(-1,)),
                                    name='dot_product_{}'.format(item))
        dot_product = mx.symbol.Reshape(dot_product, shape=(1,))
        dot_product_dict[item] = dot_product
        # L2 norm of curr_feature (name = item, e.g. 'prev_1', 'next_1')
        curr_length_sqr = mx.symbol.dot(curr_feature, mx.symbol.Reshape(curr_feature, shape=(-1,)))
        curr_length = mx.symbol.sqrt(curr_length_sqr)
        l2_product = curr_length * ref_length
        l2_product = mx.symbol.Reshape(l2_product, shape=(1,))
        # exp of the cosine similarity between curr and ref embeddings
        score_dict[item] = mx.symbol.exp(dot_product / l2_product, name='score_{}'.format(item))
    arg_shape, output_shape, aux_shape = embedding_dict['data'].infer_shape(data=(2, 3, 384, 1280))
    print 'embedding feature shapes:', arg_shape, output_shape, aux_shape
# calculate the weight of every feature
weight_dict = {}
weight_sum = score_dict['data']
for item in score_dict.keys():
if item != 'data':
weight_sum += score_dict[item]
for item in score_dict.keys():
weight_dict[item] = score_dict[item] / weight_sum
    # start from the unweighted reference so the loop can accumulate in place,
    # then subtract it again: what remains is the pure weighted sum of features
    relu5_3 = relu5_3_dict['data']
    for item in weight_dict.keys():
        relu5_3_temp = mx.symbol.broadcast_mul(lhs=relu5_3_dict[item], rhs=weight_dict[item])
        relu5_3 += relu5_3_temp
    relu5_3 -= relu5_3_dict['data']
    return relu5_3
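The aggregation above weights each warped feature by exp of the cosine similarity between its embedding and the reference embedding, normalized so the weights sum to one. A minimal pure-Python sketch with plain lists in place of MXNet symbols (names and values are illustrative only):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def aggregation_weights(embeddings, ref_key='data'):
    # exp(cosine similarity to the reference), normalized to sum to 1
    scores = {k: math.exp(cosine(v, embeddings[ref_key])) for k, v in embeddings.items()}
    total = sum(scores.values())
    return {k: s / total for k, s in scores.items()}

emb = {'data': [1.0, 0.0], 'prev_1': [1.0, 0.0], 'next_1': [0.0, 1.0]}
w = aggregation_weights(emb)
# Embeddings identical to the reference share the largest weight.
assert abs(sum(w.values()) - 1.0) < 1e-9
assert w['data'] == w['prev_1'] > w['next_1']
```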
def get_vgg_train_ffa(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
    Flow-guided feature aggregation.
    Faster R-CNN end-to-end with VGG-16 conv layers, adapted for deep
    feature flow: nearby features are warped with a FlowNet and aggregated
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
# flow share param
param_variable_list = [mx.sym.Variable(item) for item in config.SHARE_PARAMS_LIST]
param_dic = dict(zip(config.SHARE_PARAMS_LIST, param_variable_list))
# create data dict to store curr image and nearby image
data_dict = {}
for i in range(config.FRAMES_FEATURE_AGGREGATION):
data_dict['prev_{}'.format(i+1)] = mx.symbol.Variable(name='prev_{}'.format(i+1))
data_dict['next_{}'.format(i+1)] = mx.symbol.Variable(name='next_{}'.format(i+1))
data_dict['data'] = mx.symbol.Variable(name="data")
im_info = mx.symbol.Variable(name="im_info")
gt_boxes = mx.symbol.Variable(name="gt_boxes")
rpn_label = mx.symbol.Variable(name='label')
rpn_bbox_target = mx.symbol.Variable(name='bbox_target')
rpn_bbox_weight = mx.symbol.Variable(name='bbox_weight')
# direct feature from rcnn (relu5_3 reference)
relu5_3_ref = get_vgg_dilate_conv(data_dict['data'])
# create feature dict to store ref feature and nearby features
relu5_3_dict = {}
    for item in data_dict.keys():
        # use flow to warp the reference feature towards each frame
        relu5_3_dict[item], _, _ = feature_propagate_share(item, param_dic, relu5_3_ref,
                                                           data_dict[item], data_dict['data'])
relu5_3_dict['data'] = relu5_3_ref
    # feature aggregation over nearby warped features
relu5_3 = get_feature_aggregation(relu5_3_dict)
# RPN layers
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# prepare rpn data
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
# classification
rpn_cls_prob = mx.symbol.SoftmaxOutput(data=rpn_cls_score_reshape, label=rpn_label, multi_output=True,
normalization='valid', use_ignore=True, ignore_label=-1,
is_hidden_layer=config.TRAIN.RPN_OHEM, name="rpn_cls_prob")
# bounding box regression
if config.RPN_IOU_LOSS:
rpn_bbox_loss_ = mx.symbol.Custom(data=rpn_bbox_pred, bbox_target=rpn_bbox_target, bbox_weight=rpn_bbox_weight,
op_type='rpn_iou_loss', name='rpn_bbox_loss_',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES),
ratios=tuple(config.ANCHOR_RATIOS))
else:
rpn_bbox_loss_ = rpn_bbox_weight * mx.symbol.smooth_l1(name='rpn_bbox_loss_', scalar=3.0, data=(rpn_bbox_pred - rpn_bbox_target))
# rpn output
if config.TRAIN.RPN_OHEM:
group = mx.symbol.Custom(
cls_prob=rpn_cls_prob, bbox_loss=rpn_bbox_loss_, label=rpn_label, bbox_pred=rpn_bbox_pred,
name='rpn_ohem', op_type='sample_anchors',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_OHEM_ANCHORS, rpn_batch_size=config.TRAIN.RPN_BATCH_SIZE,
nms_threshold=config.TRAIN.RPN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RPN_OHEM_TRANSFORM, ignore=config.TRAIN.RPN_OHEM_IGNORE, np_ratio=config.TRAIN.RPN_OHEM_NP_RATIO)
rpn_group = [group[0], group[1], group[2], group[3]]
else:
rpn_bbox_loss = mx.sym.MakeLoss(name='rpn_bbox_loss', data=rpn_bbox_loss_, grad_scale=1.0 / config.TRAIN.RPN_BATCH_SIZE)
rpn_group = [rpn_cls_prob, rpn_bbox_loss]
# ROI proposal
rpn_cls_act = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_act")
rpn_cls_act_reshape = mx.symbol.Reshape(
data=rpn_cls_act, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_act_reshape')
if config.TRAIN.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_act_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TRAIN.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TRAIN.RPN_POST_NMS_TOP_N,
threshold=config.TRAIN.RPN_NMS_THRESH, rpn_min_size=config.TRAIN.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# ROI proposal target
gt_boxes_reshape = mx.symbol.Reshape(data=gt_boxes, shape=(-1, 5), name='gt_boxes_reshape')
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.RCNN_OHEM_ROIS, ohem=config.TRAIN.RCNN_OHEM)
else:
group = mx.symbol.Custom(rois=rois, gt_boxes=gt_boxes_reshape, op_type='proposal_target',
num_classes=num_classes, batch_images=config.TRAIN.BATCH_IMAGES,
batch_rois=config.TRAIN.BATCH_ROIS, fg_fraction=config.TRAIN.FG_FRACTION)
rois = group[0]
label = group[1]
bbox_target = group[2]
bbox_weight = group[3]
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score, label=label, normalization='batch',
is_hidden_layer=config.TRAIN.RCNN_OHEM)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
if config.RCNN_IOU_LOSS:
bbox_loss_ = mx.symbol.Custom(data=bbox_pred, bbox_target=bbox_target, bbox_weight=bbox_weight, rois=rois,
op_type='rcnn_iou_loss', name='bbox_loss_', num_classes=num_classes)
else:
bbox_loss_ = bbox_weight * mx.symbol.smooth_l1(name='bbox_loss_', scalar=1.0, data=(bbox_pred - bbox_target))
if config.TRAIN.RCNN_OHEM:
group = mx.symbol.Custom(
cls_prob=cls_prob, bbox_loss=bbox_loss_, label=label, rois=rois, bbox_pred=bbox_pred,
name='rcnn_ohem', op_type='sample_rois',
batch_images=config.TRAIN.BATCH_IMAGES, batch_size=config.TRAIN.BATCH_ROIS,
nms_threshold=config.TRAIN.RCNN_OHEM_NMS, iou_loss=config.RPN_IOU_LOSS,
transform=config.TRAIN.RCNN_OHEM_TRANSFORM, ignore=config.TRAIN.RCNN_OHEM_IGNORE)
rcnn_group = [group[0], group[1], group[2], group[3]]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss', 'cls_mask', 'bbox_mask'],
[num_classes, num_classes * 4, num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
else:
bbox_loss = mx.sym.MakeLoss(name='bbox_loss', data=bbox_loss_, grad_scale=1.0 / config.TRAIN.BATCH_ROIS)
rcnn_group = [cls_prob, bbox_loss]
for ind, name, last_shape in zip(range(len(rcnn_group)),
['cls_prob', 'bbox_loss'],
[num_classes, num_classes * 4]):
rcnn_group[ind] = mx.symbol.Reshape(data=rcnn_group[ind], shape=(config.TRAIN.BATCH_IMAGES, -1, last_shape),
name=name + '_reshape')
# append label
label = mx.symbol.Reshape(data=label, shape=(config.TRAIN.BATCH_IMAGES, -1), name='label_reshape')
rcnn_group += [mx.symbol.BlockGrad(label, name='label_blockgrad')]
group = mx.symbol.Group(rpn_group + rcnn_group)
return group
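Both non-IoU loss branches above use mx.symbol.smooth_l1 with a scalar sigma (3.0 for the RPN, 1.0 for the R-CNN head). Its piecewise form is easy to verify in pure Python; this follows MXNet's documented definition, where the quadratic zone is |x| < 1/sigma^2:

```python
def smooth_l1(x, sigma=1.0):
    # f(x) = 0.5 * (sigma * x)^2        if |x| < 1 / sigma^2
    #      = |x| - 0.5 / sigma^2        otherwise
    s2 = sigma * sigma
    if abs(x) < 1.0 / s2:
        return 0.5 * s2 * x * x
    return abs(x) - 0.5 / s2

# sigma=1: quadratic inside |x| < 1, linear outside; continuous at the joint.
assert smooth_l1(0.5) == 0.125
assert smooth_l1(2.0) == 1.5
# sigma=3 (the RPN setting): the quadratic zone shrinks to |x| < 1/9.
assert smooth_l1(0.5, sigma=3.0) == 0.5 - 0.5 / 9.0
```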
def get_vgg_test_ffa(num_classes=config.NUM_CLASSES, num_anchors=config.NUM_ANCHORS):
"""
    Flow-guided feature aggregation: Faster R-CNN test symbol with VGG-16 conv layers
:param num_classes: used to determine output size
:param num_anchors: used to determine output size
:return: Symbol
"""
param_variable_list = [mx.sym.Variable(item) for item in config.SHARE_PARAMS_LIST]
param_dic = dict(zip(config.SHARE_PARAMS_LIST, param_variable_list))
data_dict = {}
for i in range(config.FRAMES_FEATURE_AGGREGATION):
data_dict['prev_{}'.format(i+1)] = mx.symbol.Variable(name='prev_{}'.format(i+1))
data_dict['next_{}'.format(i+1)] = mx.symbol.Variable(name='next_{}'.format(i+1))
data_dict['data'] = mx.symbol.Variable(name="data")
# data = data_dict['data']
im_info = mx.symbol.Variable(name="im_info")
gt_boxes = mx.symbol.Variable(name="gt_boxes")
rpn_label = mx.symbol.Variable(name='label')
rpn_bbox_target = mx.symbol.Variable(name='bbox_target')
rpn_bbox_weight = mx.symbol.Variable(name='bbox_weight')
# shared convolutional layers
# relu5_3 = get_vgg_conv(data)
relu5_3_ref = get_vgg_dilate_conv(data_dict['data'])
arg_shape, output_shape, aux_shape = relu5_3_ref.infer_shape(data=(1, 3, 384, 1280))
    print 'relu5_3_ref shapes:', arg_shape, output_shape, aux_shape
relu5_3_dict = {}
    for item in data_dict.keys():
        # warp the reference feature towards each nearby frame via flow
        relu5_3_dict[item], flow, flow_avg = feature_propagate_share(item, param_dic, relu5_3_ref,
                                                                     data_dict[item], data_dict['data'])
    relu5_3_dict['data'] = relu5_3_ref
relu5_3 = get_feature_aggregation(relu5_3_dict)
# RPN
rpn_conv = mx.symbol.Convolution(
data=relu5_3, kernel=(3, 3), pad=(1, 1), num_filter=512, name="rpn_conv_3x3")
rpn_relu = mx.symbol.Activation(data=rpn_conv, act_type="relu", name="rpn_relu")
rpn_cls_score = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=2 * num_anchors, name="rpn_cls_score")
rpn_bbox_pred = mx.symbol.Convolution(
data=rpn_relu, kernel=(1, 1), pad=(0, 0), num_filter=4 * num_anchors, name="rpn_bbox_pred")
# ROI Proposal
rpn_cls_score_reshape = mx.symbol.Reshape(
data=rpn_cls_score, shape=(0, 2, -1, 0), name="rpn_cls_score_reshape")
rpn_cls_prob = mx.symbol.SoftmaxActivation(
data=rpn_cls_score_reshape, mode="channel", name="rpn_cls_prob")
rpn_cls_prob_reshape = mx.symbol.Reshape(
data=rpn_cls_prob, shape=(0, 2 * num_anchors, -1, 0), name='rpn_cls_prob_reshape')
if config.TEST.CXX_PROPOSAL:
rois = mx.symbol.Proposal(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
feature_stride=config.RPN_FEAT_STRIDE, scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
else:
rois = mx.symbol.Custom(
cls_prob=rpn_cls_prob_reshape, bbox_pred=rpn_bbox_pred, im_info=im_info, name='rois',
op_type='proposal', feat_stride=config.RPN_FEAT_STRIDE,
scales=tuple(config.ANCHOR_SCALES), ratios=tuple(config.ANCHOR_RATIOS),
rpn_pre_nms_top_n=config.TEST.RPN_PRE_NMS_TOP_N, rpn_post_nms_top_n=config.TEST.RPN_POST_NMS_TOP_N,
threshold=config.TEST.RPN_NMS_THRESH, rpn_min_size=config.TEST.RPN_MIN_SIZE, iou_loss=config.RPN_IOU_LOSS)
# Fast R-CNN
roi_pool = mx.symbol.ROIPooling(
name='roi_pool', data=relu5_3, rois=rois, pooled_size=(7, 7), spatial_scale=1.0 / config.RCNN_FEAT_STRIDE)
if config.RCNN_CTX_WINDOW:
roi_pool_ctx = mx.symbol.ROIPooling(
name='roi_pool_ctx', data=relu5_3, rois=rois, pooled_size=(7, 7),
spatial_scale=1.0 / config.RCNN_FEAT_STRIDE, pad=0.25)
roi_pool_concat = mx.symbol.Concat(roi_pool, roi_pool_ctx, name='roi_pool_concat')
roi_pool_red = mx.symbol.Convolution(
data=roi_pool_concat, num_filter=512, kernel=(1, 1), stride=(1, 1), name='roi_pool_ctx_red')
roi_pool = mx.symbol.Activation(data=roi_pool_red, act_type='relu', name='roi_pool_relu')
# group 6
flatten = mx.symbol.Flatten(data=roi_pool, name="flatten")
fc6 = mx.symbol.FullyConnected(data=flatten, num_hidden=4096, name="fc6")
relu6 = mx.symbol.Activation(data=fc6, act_type="relu", name="relu6")
drop6 = mx.symbol.Dropout(data=relu6, p=0.5, name="drop6")
# group 7
fc7 = mx.symbol.FullyConnected(data=drop6, num_hidden=4096, name="fc7")
relu7 = mx.symbol.Activation(data=fc7, act_type="relu", name="relu7")
drop7 = mx.symbol.Dropout(data=relu7, p=0.5, name="drop7")
# classification
cls_score = mx.symbol.FullyConnected(name='cls_score', data=drop7, num_hidden=num_classes)
cls_prob = mx.symbol.SoftmaxOutput(name='cls_prob', data=cls_score)
# bounding box regression
bbox_pred = mx.symbol.FullyConnected(name='bbox_pred', data=drop7, num_hidden=num_classes * 4)
# reshape output
cls_prob = mx.symbol.Reshape(data=cls_prob, shape=(config.TEST.BATCH_IMAGES, -1, num_classes), name='cls_prob_reshape')
bbox_pred = mx.symbol.Reshape(data=bbox_pred, shape=(config.TEST.BATCH_IMAGES, -1, 4 * num_classes), name='bbox_pred_reshape')
# group output
group = mx.symbol.Group([rois, cls_prob, bbox_pred])
# group = mx.symbol.Group([rois, cls_prob, bbox_pred, flow, flow_avg])
return group