# File: evology/bin/mc.py (repo: aymericvie/evology, license: Apache-2.0)
import numpy as np

# `solve` (the market-clearing root finder) is imported elsewhere in the evology project.

print(
    "Looking at different choices for representing the trading functions and how they impact the price when we initialise at p=100"
)
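`solve` is not defined in this snippet. A minimal stand-in (the bisection bracket and the helper name `excess_demand` are assumptions, not the project's actual solver) just finds the price at which the strategies' excess demands sum to zero:

```python
import math

def solve(functions, initial_price):
    # Total excess demand across all strategies at a given price.
    # The first argument of each demand function (asset_key) is unused here.
    def excess_demand(price):
        return sum(f(None, price) for f in functions)

    # Bracket the clearing price around the initial price, then bisect.
    lo, hi = initial_price * 0.1, initial_price * 10.0
    f_lo = excess_demand(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        f_mid = excess_demand(mid)
        if (f_lo > 0) == (f_mid > 0):
            lo, f_lo = mid, f_mid  # root lies in the upper half
        else:
            hi = mid
    return [0.5 * (lo + hi)]
```

Like the project's solver, it returns a one-element sequence, which is why the iterated experiments below take `solve(...)[0]`.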
initial_price = 100
wealth = 50_000_000 + 500_000 * initial_price
assets = 500_000
print("For ValNT = 100, reference")
def func1(asset_key, price): # VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func2(asset_key, price): # NT
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func3(asset_key, price): # TF
return (wealth / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
initial_price = 100
wealth = 50_000_000 + 500_000 * initial_price
assets = 500_000
print("For ValNT = 100, reference WITH LEVERAGE ")
def func1(asset_key, price): # VI
return (8 * wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func2(asset_key, price): # NT
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func3(asset_key, price): # TF
return (wealth / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print("For ValNT = 110, higher price as expected")
def func1(asset_key, price): # VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func2(asset_key, price): # NT
return (wealth / price) * np.tanh(np.log2(110) - np.log2(price)) - assets
def func3(asset_key, price): # TF
return (wealth / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print("For ValNT = 90, lower price as expected")
def func1(asset_key, price): # VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func2(asset_key, price): # NT
return (wealth / price) * np.tanh(np.log2(90) - np.log2(price)) - assets
def func3(asset_key, price): # TF
return (wealth / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print("With 0.5 inside tanh, the price is higher")
def func1(asset_key, price): # VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price) + 0.5) - assets
def func2(asset_key, price): # NT
return (wealth / price) * np.tanh(np.log2(90) - np.log2(price) + 0.5) - assets
def func3(asset_key, price): # TF
return (wealth / price) * np.tanh(0.5 + 0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print("With 0.5 outside tanh, the price is even higher")
def func1(asset_key, price): # VI
return (wealth / price) * (np.tanh(np.log2(100) - np.log2(price)) + 0.5) - assets
def func2(asset_key, price): # NT
return (wealth / price) * (np.tanh(np.log2(90) - np.log2(price)) + 0.5) - assets
def func3(asset_key, price): # TF
return (wealth / price) * (np.tanh(0.5) + 0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print(
    "Question: MC is deterministic, which is good. But does the 0.5 choice give unintended power to some strategies?"
)
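The size of that distortion can be read off directly at the fundamental price, where the mispricing signal is zero. A quick check (not part of the original script) of the three specifications:

```python
import math

x = 0.0  # mispricing signal log2(value) - log2(price) at the fundamental
print(math.tanh(x))        # no shift: zero demand adjustment
print(math.tanh(x + 0.5))  # shift inside tanh: ~0.462, still bounded by +/-1
print(math.tanh(x) + 0.5)  # shift outside tanh: exactly 0.5, a constant long bias
```

The outside shift adds a bias that survives saturation (demand ranges over [-0.5, 1.5] instead of [-1, 1]), which is consistent with the even-higher clearing price observed above.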
"""
print('----')
print('Lets study the resulting 10 day series')
print('VI 90, no 0.5')
new_price = 100
cash = 50_000_000
asset = 500_000
for i in range(10):
wealth = cash + asset * new_price
def func1(asset_key, price): #VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price)) - assets
def func2(asset_key, price): #NT
return (wealth / price) * np.tanh(np.log2(90) - np.log2(price)) - assets
def func3(asset_key, price): #TF
return (wealth / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = float(solve(functions, initial_price)[0])
print(new_price)
print('With 0.5 inside tanh')
new_price = 100
cash = 50_000_000
asset = 500_000
for i in range(10):
wealth = cash + asset * new_price
def func1(asset_key, price): #VI
return (wealth / price) * np.tanh(np.log2(100) - np.log2(price) + 0.5) - assets
def func2(asset_key, price): #NT
return (wealth / price) * np.tanh(np.log2(90) - np.log2(price) + 0.5) - assets
def func3(asset_key, price): #TF
return (wealth / price) * np.tanh(0.5 + 0.5) - assets
functions = [func1, func2, func3]
new_price = float(solve(functions, initial_price)[0])
print(new_price)
print('With 0.5 outside tanh')
new_price = 100
cash = 50_000_000
asset = 500_000
for i in range(10):
wealth = cash + asset * new_price
def func1(asset_key, price): #VI
return (wealth / price) * (np.tanh(np.log2(100) - np.log2(price)) + 0.5) - assets
def func2(asset_key, price): #NT
return (wealth / price) * (np.tanh(np.log2(90) - np.log2(price)) + 0.5) - assets
def func3(asset_key, price): #TF
return (wealth / price) * (np.tanh(0.5) + 0.5) - assets
functions = [func1, func2, func3]
new_price = float(solve(functions, initial_price)[0])
print(new_price)
"""
"""
print('With 0.5 outside tanh, and VI/NT depending on previous price')
print('Then we get oscillations around two attractors')
print('Unless we adapt TF and then nothing happens')
new_price = 100
previous_price = 90
cash = 50_000_000
asset = 500_000
for i in range(10):
wealth = cash + asset * new_price
LogPrev = np.log2(new_price)
def func1(asset_key, price): #VI
return (wealth / price) * (np.tanh(np.log2(100) - LogPrev) + 0.5) - assets
def func2(asset_key, price): #NT
return (wealth / price) * (np.tanh(np.log2(90) - LogPrev) + 0.5) - assets
def func3(asset_key, price): #TF
return (wealth / price) * (np.tanh(np.log2(price) - LogPrev) + 0.5) - assets
functions = [func1, func2, func3]
new_price = float(solve(functions, initial_price)[0])
print(new_price)
print('Without 0.5, and VI/NT depending on previous price')
print('Then we stabilise')
new_price = 100
previous_price = 90
cash = 50_000_000
asset = 500_000
for i in range(10):
wealth = cash + asset * new_price
LogPrev = np.log2(new_price)
def func1(asset_key, price): #VI
return (wealth / price) * (np.tanh(np.log2(100) - LogPrev) + 0) - assets
def func2(asset_key, price): #NT
return (wealth / price) * (np.tanh(np.log2(90) - LogPrev) + 0) - assets
def func3(asset_key, price): #TF
return (wealth / price) * (np.tanh(np.log2(price) - LogPrev) + 0) - assets
functions = [func1, func2, func3]
new_price = float(solve(functions, initial_price)[0])
print(new_price)
"""
"""
print('Y Y N with noise')
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(price) + 0.5)) - 500_000
ValNT = 100 + 10
print(ValNT)
def func2(asset_key, price): #noise trader
    return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(ValNT) - np.log2(price) + 0.5)) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0) + 0.5) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print('Y Y N without noise')
initial_price = 100
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(price) + 0.5)) - 500_000
ValNT = 100
print(ValNT)
def func2(asset_key, price): #noise trader
    return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(ValNT) - np.log2(price) + 0.5)) - 500_000
def func3(asset_key, price): #trend follower
    return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0) + 0.5) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print('Y Y N with noise without 0.5')
initial_price = 100
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(price))) - 500_000
ValNT = 100 + 10
print(ValNT)
def func2(asset_key, price): #noise trader
    return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(ValNT) - np.log2(price))) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0)) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print('Y Y N without noise without 0.5')
initial_price = 100
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(price))) - 500_000
ValNT = 100
print(ValNT)
def func2(asset_key, price): #noise trader
    return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(ValNT) - np.log2(price))) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0)) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print('N N N ')
initial_price = 100
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100 + random.normalvariate(0,1)) - np.log2(price)) + 0.5) - 500_000
def func2(asset_key, price): #noise trader
return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(100 + random.normalvariate(0,1)) - np.log2(price)) + 0.5) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0) + 0.5) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
print('Y Y Y ')
initial_price = 100
import random
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(price) + 0.5)) - 500_000
def func2(asset_key, price): #noise trader
return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(100 + random.normalvariate(0,1)) - np.log2(price) + 0.5)) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0 + 0.5)) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
# print(func1(0, float(new_price[0])))
# print(func2(0, float(new_price[0])))
# print(func3(0, float(new_price[0])))
# for i in range(10):
# new_price = solve(functions, float(new_price[0]))
# print(new_price)
print('------')
print('VI and NT depending on last price')
initial_price = 100
def func1(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(initial_price) + 0.5)) - 500_000
def func2(asset_key, price): #noise trader
return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(100 + random.normalvariate(0,1)) - np.log2(initial_price) + 0.5)) - 500_000
def func3(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0 + 0.5)) - 500_000
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
# print(func1(0, float(new_price[0])))
# print(func2(0, float(new_price[0])))
# print(func3(0, float(new_price[0])))
# for i in range(10):
# new_price = solve(functions, float(new_price[0]))
# print(new_price)
print('------')
print('Removing 0.5')
initial_price = 100
def func4(asset_key, price): #value investor
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(np.log2(100) - np.log2(initial_price))) - 500_000
def func5(asset_key, price): #noise trader
return ((50_000_000 + 500_000 * float(initial_price)) / price) * (np.tanh(np.log2(100 + random.normalvariate(0,1)) - np.log2(initial_price))) - 500_000
def func6(asset_key, price): #trend follower
return ((50_000_000 + 500_000 * float(initial_price)) / price ) * (np.tanh(0)) - 500_000
functions = [func4, func5, func6]
new_price = solve(functions, initial_price)
print(new_price)
"""
initial_price = 100
wealth = 50_000_000 + 500_000 * initial_price
assets = 400_000
print("For ValNT = 100, with 60M cash and 400k assets")
def func1(asset_key, price): # VI
return ((60_000_000 + assets * initial_price) / price) * np.tanh(
np.log2(100) - np.log2(price)
) - assets
def func2(asset_key, price): # NT
return ((60_000_000 + assets * initial_price) / price) * np.tanh(
np.log2(100) - np.log2(price)
) - assets
def func3(asset_key, price): # TF
return ((60_000_000 + assets * initial_price) / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
assets = 500_000
print("For ValNT = 100, reference with 50M cash and 500k assets")
def func1(asset_key, price): # VI
return ((50_000_000 + assets * initial_price) / price) * np.tanh(
np.log2(100) - np.log2(price)
) - assets
def func2(asset_key, price): # NT
return ((50_000_000 + assets * initial_price) / price) * np.tanh(
np.log2(100) - np.log2(price)
) - assets
def func3(asset_key, price): # TF
return ((50_000_000 + assets * initial_price) / price) * np.tanh(0.5) - assets
functions = [func1, func2, func3]
new_price = solve(functions, initial_price)
print(new_price)
# File: highcliff/medication/__init__.py (repo: sermelo/Highcliff-SDK, license: Apache-2.0)
from highcliff.medication.medication import MonitorMedication, RequestMedication, ConfirmMedicationGiven
# File: src/nncomp_molecule/decoders/__init__.py (repo: k-fujikawa/Kaggle-BMS-Molecular-Translation, license: MIT)
from . import rnn  # NOQA
from . import transformer # NOQA
from . import transformer_v2  # NOQA
# File: src/openprocurement/tender/competitivedialogue/tests/stage1/lot_blanks.py (repo: pontostroy/api, license: Apache-2.0)
from copy import deepcopy
from openprocurement.api.utils import get_now
from openprocurement.api.constants import RELEASE_2020_04_19
from openprocurement.tender.core.tests.cancellation import activate_cancellation_with_complaints_after_2020_04_19
# CompetitiveDialogueEU(UA)LotBidderResourceTest
from openprocurement.tender.belowthreshold.tests.base import test_cancellation
def create_tender_bidder_invalid(self):
request_path = "/tenders/{}/bids".format(self.tender_id)
response = self.app.post_json(
request_path,
{"data": {"selfEligible": True, "selfQualified": True, "tenderers": self.test_bids_data[0]["tenderers"]}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"This field is required."], u"location": u"body", u"name": u"lotValues"}],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"This field is required."]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": "0" * 32}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"relatedLot should be one of lots"]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
# Field 'value' doesn't exists on first stage
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 5000000}, "relatedLot": self.initial_lots[0]["id"]}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [
{"value": {"amount": 500, "valueAddedTaxIncluded": False}, "relatedLot": self.initial_lots[0]["id"]}
],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500, "currency": "USD"}, "relatedLot": self.initial_lots[0]["id"]}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"value": {"amount": 500},
"lotValues": [{"value": {"amount": 500}, "relatedLot": self.initial_lots[0]["id"]}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
def patch_tender_bidder(self):
lot_id = self.initial_lots[0]["id"]
response = self.app.post_json(
"/tenders/{}/bids".format(self.tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
bidder = response.json["data"]
bid_token = response.json["access"]["token"]
lot = bidder["lotValues"][0]
response = self.app.patch_json(
"/tenders/{}/bids/{}?acc_token={}".format(self.tender_id, bidder["id"], bid_token),
{"data": {"tenderers": [{"name": u"Державне управління управлінням справами"}]}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["lotValues"][0]["date"], lot["date"])
self.assertNotEqual(response.json["data"]["tenderers"][0]["name"], bidder["tenderers"][0]["name"])
response = self.app.patch_json(
"/tenders/{}/bids/{}?acc_token={}".format(self.tender_id, bidder["id"], bid_token),
{
"data": {
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}],
"tenderers": self.test_bids_data[0]["tenderers"],
}
},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["lotValues"][0]["date"], lot["date"])
self.assertEqual(response.json["data"]["tenderers"][0]["name"], bidder["tenderers"][0]["name"])
# If we don't change anything then return null
response = self.app.patch_json(
"/tenders/{}/bids/{}?acc_token={}".format(self.tender_id, bidder["id"], bid_token),
{"data": {"lotValues": [{"value": {"amount": 400}, "relatedLot": lot_id}]}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/bids/{}?acc_token={}".format(self.tender_id, bidder["id"], bid_token))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertNotIn("lotValues", response.json["data"])
response = self.app.patch_json(
"/tenders/{}/bids/{}?acc_token={}".format(self.tender_id, bidder["id"], bid_token),
{"data": {"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}], "status": "active"}},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json["errors"][0]["description"], "Can't update bid in current (unsuccessful) tender status"
)
# CompetitiveDialogueEULotFeatureBidderResourceTest
def create_tender_with_features_bidder_invalid(self):
request_path = "/tenders/{}/bids".format(self.tender_id)
response = self.app.post_json(
request_path,
{"data": {"selfEligible": True, "selfQualified": True, "tenderers": self.test_bids_data[0]["tenderers"]}},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"This field is required."]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": "0" * 32}],
}
},
status=422,
)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": [{u"relatedLot": [u"relatedLot should be one of lots"]}],
u"location": u"body",
u"name": u"lotValues",
}
],
)
# Field 'value' doesn't exists on first stage
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 5000000}, "relatedLot": self.lot_id}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500, "valueAddedTaxIncluded": False}, "relatedLot": self.lot_id}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
response = self.app.post_json(
request_path,
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500, "currency": "USD"}, "relatedLot": self.lot_id}],
}
},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
# CompetitiveDialogueEULotProcessTest
def one_lot_0bid(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# switch to active.tendering
response = self.set_status("active.tendering")
self.assertNotIn("auctionPeriod", response.json["data"]["lots"][0])
# switch to unsuccessful
response = self.set_status("active.stage2.pending", {"status": "active.tendering"})
self.app.authorization = ("Basic", ("chronograph", ""))
response = self.app.patch_json("/tenders/{}".format(tender_id), {"data": {"id": tender_id}})
self.assertEqual(response.json["data"]["lots"][0]["status"], "unsuccessful")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}, status=403
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(
response.json["errors"],
[{"location": "body", "name": "data", "description": "Can't add lot in current (unsuccessful) tender status"}],
)
def one_lot_2bid_1unqualified(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
bidder_data["identifier"]["id"] = u"00037256"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}],
}
},
)
bidder_data["identifier"]["id"] = u"00037257"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}],
}
},
)
bidder_data["identifier"]["id"] = u"00037258"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id}],
}
},
)
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualifications[0]["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualifications[1]["id"], owner_token),
{"data": {"status": "unsuccessful"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualifications[2]["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active.pre-qualification.stand-still")
def one_lot_2bid(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
bidder_data["identifier"]["id"] = u"00037256"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 450}, "relatedLot": lot_id}],
}
},
)
bid_id = response.json["data"]["id"]
bid_token = response.json["access"]["token"]
# create second bid
self.app.authorization = ("Basic", ("broker", ""))
bidder_data["identifier"]["id"] = u"00037257"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 475}, "relatedLot": lot_id}],
}
},
)
# create third
bidder_data["identifier"]["id"] = u"00037258"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 470}, "relatedLot": lot_id}],
}
},
)
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.get("/tenders/{}?acc_token={}".format(tender_id, owner_token))
self.assertEqual(response.status, "200 OK")
for bid in response.json["data"]["bids"]:
self.assertEqual(bid["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.check_chronograph()
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "200 OK")
def two_lot_2bid_1lot_del(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for lot in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
self.initial_lots = lots
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.test_tender_data["items"][0] for i in lots]}},
)
response = self.set_status("active.tendering")
# create bid
bids = []
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[0]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
bids.append(response.json)
# create second bid
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": self.test_bids_data[1]["tenderers"],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
bids.append(response.json)
response = self.app.delete("/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, lots[0], owner_token))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
def one_lot_3bid_1del(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bids
self.app.authorization = ("Basic", ("broker", ""))
bids = []
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
for index, test_bid in enumerate(self.test_bids_data):
bidder_data["identifier"]["id"] = "00037256" + str(index)
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 450}, "relatedLot": lot_id}],
}
},
)
bids.append({response.json["data"]["id"]: response.json["access"]["token"]})
response = self.app.delete(
"/tenders/{}/bids/{}?acc_token={}".format(tender_id, bids[2].keys()[0], bids[2].values()[0])
)
self.assertEqual(response.status, "200 OK")
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
# check tender status
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
def one_lot_3bid_1un(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lot_id = response.json["data"]["id"]
self.initial_lots = [response.json["data"]]
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token), {"data": {"items": [{"relatedLot": lot_id}]}}
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
bids = []
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
for i in range(3):
bidder_data["identifier"]["id"] = "00037256" + str(i)
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 450}, "relatedLot": lot_id}],
}
},
)
bids.append({response.json["data"]["id"]: response.json["access"]["token"]})
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
for qualification in qualifications:
if qualification["bidID"] == bids[2].keys()[0]:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "unsuccessful"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "unsuccessful")
else:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
self.check_chronograph()
response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.status, "200 OK")
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
def two_lot_0bid(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for lot in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.test_tender_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
self.time_shift("active.pre-qualification")
self.check_chronograph()
# switch to unsuccessful
self.app.authorization = ("Basic", ("broker", ""))
response = self.app.get("/tenders/{}?acc_token={}".format(tender_id, owner_token))
self.assertTrue(all([i["status"] == "unsuccessful" for i in response.json["data"]["lots"]]))
self.assertEqual(response.json["data"]["status"], "unsuccessful")
def two_lot_2can(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for lot in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.test_tender_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
set_complaint_period_end = getattr(self, "set_complaint_period_end", None)
if RELEASE_2020_04_19 < get_now() and set_complaint_period_end:
set_complaint_period_end()
# cancel every lot
for lot_id in lots:
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active",
"cancellationOf": "lot",
"relatedLot": lot_id,
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(tender_id, owner_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
if RELEASE_2020_04_19 < get_now():
activate_cancellation_with_complaints_after_2020_04_19(self, cancellation_id, tender_id, owner_token)
response = self.app.get("/tenders/{}".format(tender_id))
self.assertTrue(all([i["status"] == "cancelled" for i in response.json["data"]["lots"]]))
self.assertEqual(response.json["data"]["status"], "cancelled")
def two_lot_2bid_0com_1can(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for lot in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
# add item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.test_tender_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
# create bid
self.app.authorization = ("Basic", ("broker", ""))
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
bidder_data["identifier"]["id"] = u"00037256"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
bidder_data["identifier"]["id"] = u"00037257"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 499}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
bidder_data["identifier"]["id"] = u"00037258"
response = self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 499}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
set_complaint_period_end = getattr(self, "set_complaint_period_end", None)
if RELEASE_2020_04_19 < get_now() and set_complaint_period_end:
set_complaint_period_end()
self.app.authorization = ("Basic", ("broker", ""))
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active",
"cancellationOf": "lot",
"relatedLot": lots[0],
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(tender_id, owner_token),
{"data": cancellation},
)
cancellation_id = response.json["data"]["id"]
if RELEASE_2020_04_19 < get_now():
activate_cancellation_with_complaints_after_2020_04_19(self, cancellation_id, tender_id, owner_token)
response = self.app.get("/tenders/{}?acc_token={}".format(tender_id, owner_token))
self.assertEqual(response.status, "200 OK")
# active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.assertEqual(len(qualifications), 3)
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
def two_lot_2bid_2com_2win(self):
self.app.authorization = ("Basic", ("broker", ""))
# create tender
response = self.app.post_json("/tenders", {"data": self.test_tender_data})
tender_id = self.tender_id = response.json["data"]["id"]
owner_token = response.json["access"]["token"]
lots = []
for lot in 2 * self.test_lots_data:
# add lot
response = self.app.post_json(
"/tenders/{}/lots?acc_token={}".format(tender_id, owner_token), {"data": self.test_lots_data[0]}
)
self.assertEqual(response.status, "201 Created")
lots.append(response.json["data"]["id"])
self.initial_lots = lots
# add item
self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [self.test_tender_data["items"][0] for i in lots]}},
)
# add relatedLot for item
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"items": [{"relatedLot": i} for i in lots]}},
)
self.assertEqual(response.status, "200 OK")
# create bid
bidder_data = deepcopy(self.test_bids_data[0]["tenderers"][0])
bidder_data["identifier"]["id"] = u"00037256"
self.app.authorization = ("Basic", ("broker", ""))
self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
# create second bid
bidder_data["identifier"]["id"] = u"00037257"
self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
# create third bid
bidder_data["identifier"]["id"] = u"00037258"
self.app.post_json(
"/tenders/{}/bids".format(tender_id),
{
"data": {
"selfEligible": True,
"selfQualified": True,
"tenderers": [bidder_data],
"lotValues": [{"value": {"amount": 500}, "relatedLot": lot_id} for lot_id in lots],
}
},
)
# switch to active.pre-qualification
self.time_shift("active.pre-qualification")
self.check_chronograph()
response = self.app.get("/tenders/{}/qualifications?acc_token={}".format(self.tender_id, owner_token))
self.assertEqual(response.content_type, "application/json")
qualifications = response.json["data"]
self.assertEqual(len(qualifications), 6)
for qualification in qualifications:
response = self.app.patch_json(
"/tenders/{}/qualifications/{}?acc_token={}".format(self.tender_id, qualification["id"], owner_token),
{"data": {"status": "active", "qualified": True, "eligible": True}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.patch_json(
"/tenders/{}?acc_token={}".format(tender_id, owner_token),
{"data": {"status": "active.pre-qualification.stand-still"}},
)
self.assertEqual(response.status, "200 OK")
# --- src/examples/VRGameFog-IFogSim-WL/placement_Cluster_Edge.py (repo: MarkoRimac/YAFS, license: MIT) ---
"""
This type of algorithm has two obligatory functions:
    *initial_allocation*: invoked at the start of the simulation
    *run*: invoked according to the assigned temporal distribution.
"""
from yafs.placement import Placement
class CloudPlacement(Placement):
"""
This implementation locates the Coordinator and Calculator services of the application in the cluster node, regardless of where the sources or sinks are located.
It only runs once, in the initialization.
"""
def initial_allocation(self, sim, app_name):
# We find the ID of the node/resource
value = {"model": "Cluster"}
id_cluster = sim.topology.find_IDs(value) #there is only ONE Cluster
value = {"model": "m-"}
id_mobiles = sim.topology.find_IDs(value)
# Given an application, we get its implemented modules
app = sim.apps[app_name]
services = app.services
for module in services.keys():
if "Coordinator" == module:
if "Coordinator" in self.scaleServices.keys():
for rep in range(0,self.scaleServices["Coordinator"]):
idDES = sim.deploy_module(app_name,module,services[module],id_cluster) #Deploy as many modules as elements in the array
elif "Calculator" == module:
if "Calculator" in self.scaleServices.keys():
for rep in range(0, self.scaleServices["Calculator"]):
idDES = sim.deploy_module(app_name,module,services[module],id_cluster)
elif "Client" == module:
idDES = sim.deploy_module(app_name,module, services[module],id_mobiles)
#end function
class FogPlacement(Placement):
"""
This implementation locates the Calculator service on the fog devices (proxies) and the Coordinator in the cluster, regardless of where the sources or sinks are located.
It only runs once, in the initialization.
"""
def initial_allocation(self, sim, app_name):
# We find the ID of the node/resource
value = {"model": "Cluster"}
id_cluster = sim.topology.find_IDs(value) #there is only ONE Cluster
value = {"model": "d-"}
id_proxies = sim.topology.find_IDs(value)
value = {"model": "m-"}
id_mobiles = sim.topology.find_IDs(value)
# Given an application, we get its implemented modules
app = sim.apps[app_name]
services = app.services
for module in services.keys():
if "Coordinator" == module:
if "Coordinator" in self.scaleServices.keys():
for rep in range(0, self.scaleServices["Coordinator"]):
idDES = sim.deploy_module(app_name, module, services[module],id_cluster) # Deploy as many modules as elements in the array
elif "Calculator" == module:
if "Calculator" in self.scaleServices.keys():
for rep in range(0, self.scaleServices["Calculator"]):
idDES = sim.deploy_module(app_name, module, services[module], id_proxies)
elif "Client" == module:
idDES = sim.deploy_module(app_name,module, services[module],id_mobiles)
# --- registry/smart_contract/migrations/0024_auto_20180819_0841.py (repo: RustamSultanov/Python-test-registry-, license: MIT) ---
# Generated by Django 2.1 on 2018-08-19 08:41
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('smart_contract', '0023_comment_adition_user'),
]
operations = [
migrations.AlterField(
model_name='comment',
name='accept',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='comment',
name='customer_flag',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='comment',
name='failure',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='comment',
name='hide',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='comment',
name='implementer_flag',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='useraccept',
name='accept',
field=models.BooleanField(blank=True, default=False),
),
migrations.AlterField(
model_name='useraccept',
name='failure',
field=models.BooleanField(blank=True, default=False),
),
]
# --- lib/src/owlracer/grpcClient/core_pb2_grpc.py (repo: MATHEMA-GmbH/Owl-Racer-AI-Client-Python, license: MIT) ---
# Generated by the gRPC Python protocol compiler plugin. DO NOT EDIT!
"""Client and server classes corresponding to protobuf-defined services."""
import grpc
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
from matlabs.owlracer import core_pb2 as matlabs_dot_owlracer_dot_core__pb2
class GrpcCoreServiceStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetCarIds = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/GetCarIds',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidListData.FromString,
)
self.CreateSession = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/CreateSession',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.CreateSessionData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.SessionData.FromString,
)
self.GetSession = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/GetSession',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.SessionData.FromString,
)
self.GetSessionIds = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/GetSessionIds',
request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidListData.FromString,
)
self.CreateCar = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/CreateCar',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.CreateCarData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
)
self.DestroyCar = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/DestroyCar',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.DestroySession = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/DestroySession',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
self.GetCarData = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/GetCarData',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
)
self.Step = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/Step',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.StepData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
)
self.Reset = channel.unary_unary(
'/matlabs.owlracer.core.GrpcCoreService/Reset',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
response_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
)
class GrpcCoreServiceServicer(object):
"""Missing associated documentation comment in .proto file."""
def GetCarIds(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateSession(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetSession(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetSessionIds(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def CreateCar(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DestroyCar(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def DestroySession(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetCarData(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Step(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def Reset(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_GrpcCoreServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'GetCarIds': grpc.unary_unary_rpc_method_handler(
servicer.GetCarIds,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidListData.SerializeToString,
),
'CreateSession': grpc.unary_unary_rpc_method_handler(
servicer.CreateSession,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.CreateSessionData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.SessionData.SerializeToString,
),
'GetSession': grpc.unary_unary_rpc_method_handler(
servicer.GetSession,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.SessionData.SerializeToString,
),
'GetSessionIds': grpc.unary_unary_rpc_method_handler(
servicer.GetSessionIds,
request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.GuidListData.SerializeToString,
),
'CreateCar': grpc.unary_unary_rpc_method_handler(
servicer.CreateCar,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.CreateCarData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.SerializeToString,
),
'DestroyCar': grpc.unary_unary_rpc_method_handler(
servicer.DestroyCar,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'DestroySession': grpc.unary_unary_rpc_method_handler(
servicer.DestroySession,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
'GetCarData': grpc.unary_unary_rpc_method_handler(
servicer.GetCarData,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.SerializeToString,
),
'Step': grpc.unary_unary_rpc_method_handler(
servicer.Step,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.StepData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.RaceCarData.SerializeToString,
),
'Reset': grpc.unary_unary_rpc_method_handler(
servicer.Reset,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.GuidData.FromString,
response_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'matlabs.owlracer.core.GrpcCoreService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class GrpcCoreService(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def GetCarIds(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/GetCarIds',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.GuidListData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def CreateSession(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/CreateSession',
matlabs_dot_owlracer_dot_core__pb2.CreateSessionData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.SessionData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetSession(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/GetSession',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.SessionData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetSessionIds(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/GetSessionIds',
google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.GuidListData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def CreateCar(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/CreateCar',
matlabs_dot_owlracer_dot_core__pb2.CreateCarData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def DestroyCar(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/DestroyCar',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def DestroySession(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/DestroySession',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetCarData(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/GetCarData',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Step(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/Step',
matlabs_dot_owlracer_dot_core__pb2.StepData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.RaceCarData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def Reset(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcCoreService/Reset',
matlabs_dot_owlracer_dot_core__pb2.GuidData.SerializeToString,
google_dot_protobuf_dot_empty__pb2.Empty.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
class GrpcResourceServiceStub(object):
"""Missing associated documentation comment in .proto file."""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetBaseImages = channel.unary_unary(
'/matlabs.owlracer.core.GrpcResourceService/GetBaseImages',
request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.ResourceImagesDataResponse.FromString,
)
self.GetTrackImage = channel.unary_unary(
'/matlabs.owlracer.core.GrpcResourceService/GetTrackImage',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.TrackIdData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.TrackImageDataResponse.FromString,
)
self.GetTrackData = channel.unary_unary(
'/matlabs.owlracer.core.GrpcResourceService/GetTrackData',
request_serializer=matlabs_dot_owlracer_dot_core__pb2.TrackIdData.SerializeToString,
response_deserializer=matlabs_dot_owlracer_dot_core__pb2.TrackData.FromString,
)
class GrpcResourceServiceServicer(object):
"""Missing associated documentation comment in .proto file."""
def GetBaseImages(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetTrackImage(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetTrackData(self, request, context):
"""Missing associated documentation comment in .proto file."""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_GrpcResourceServiceServicer_to_server(servicer, server):
rpc_method_handlers = {
'GetBaseImages': grpc.unary_unary_rpc_method_handler(
servicer.GetBaseImages,
request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.ResourceImagesDataResponse.SerializeToString,
),
'GetTrackImage': grpc.unary_unary_rpc_method_handler(
servicer.GetTrackImage,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.TrackIdData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.TrackImageDataResponse.SerializeToString,
),
'GetTrackData': grpc.unary_unary_rpc_method_handler(
servicer.GetTrackData,
request_deserializer=matlabs_dot_owlracer_dot_core__pb2.TrackIdData.FromString,
response_serializer=matlabs_dot_owlracer_dot_core__pb2.TrackData.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'matlabs.owlracer.core.GrpcResourceService', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
# This class is part of an EXPERIMENTAL API.
class GrpcResourceService(object):
"""Missing associated documentation comment in .proto file."""
@staticmethod
def GetBaseImages(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcResourceService/GetBaseImages',
google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.ResourceImagesDataResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetTrackImage(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcResourceService/GetTrackImage',
matlabs_dot_owlracer_dot_core__pb2.TrackIdData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.TrackImageDataResponse.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
@staticmethod
def GetTrackData(request,
target,
options=(),
channel_credentials=None,
call_credentials=None,
insecure=False,
compression=None,
wait_for_ready=None,
timeout=None,
metadata=None):
return grpc.experimental.unary_unary(request, target, '/matlabs.owlracer.core.GrpcResourceService/GetTrackData',
matlabs_dot_owlracer_dot_core__pb2.TrackIdData.SerializeToString,
matlabs_dot_owlracer_dot_core__pb2.TrackData.FromString,
options, channel_credentials,
insecure, call_credentials, compression, wait_for_ready, timeout, metadata)
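Each stub method above is simply the callable returned by `channel.unary_unary`, keyed by the full RPC path and paired with the matching protobuf serializer and deserializer. A minimal pure-Python sketch of that wiring, using a stand-in channel (`FakeChannel` and `MiniStub` are hypothetical names, not part of the generated module):

```python
class FakeChannel:
    """Stand-in for grpc.Channel that records registered RPC paths."""
    def __init__(self):
        self.paths = []

    def unary_unary(self, path, request_serializer=None, response_deserializer=None):
        self.paths.append(path)
        # A real channel returns a callable that performs the RPC;
        # here it just echoes the path and the request.
        return lambda request: (path, request)


class MiniStub:
    """Mirrors how GrpcCoreServiceStub wires one method in __init__."""
    def __init__(self, channel):
        self.Step = channel.unary_unary(
            '/matlabs.owlracer.core.GrpcCoreService/Step')


channel = FakeChannel()
stub = MiniStub(channel)
print(stub.Step('step-request'))
```

With a real `grpc.insecure_channel`, the same attribute access (`stub.Step(request)`) performs the unary RPC against the server.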
# File: blog/admin.py (repo: ChaoFanMa01/my_site, license: MIT)
from django.contrib import admin
from .models import Category, Tag, Article, Photo
# Register your models here.
admin.site.register([Category, Tag, Article, Photo])
# File: bare_python/s09_imutable_hashable.py (repo: AndreiHondrari/python_exploration, license: MIT)
#!python3
d = {}
d[(1, 1)] = 10
d[(1, 2)] = 22
print(d[(1, 1)])
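The snippet works because tuples are immutable and therefore hashable, so they are valid dictionary keys; a mutable key such as a list raises `TypeError`. A short extension of the same idea:

```python
d = {}
d[(1, 1)] = 10               # tuple: immutable, hashable, valid key
d[frozenset({1, 2})] = 'ok'  # frozenset is the hashable counterpart of set

try:
    d[[1, 1]] = 10           # list: mutable, unhashable
except TypeError:
    list_key_failed = True

print(d[(1, 1)], d[frozenset({1, 2})], list_key_failed)
```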
# File: Exercicios Python/ex108/moeda.py (repo: ClaudioSiqueira/Exercicios-Python, license: MIT)
def metade(preco):
res = preco/2
return res
def aumentar(preco, taxa):
res = preco + (preco * taxa/100)
return res
def diminuir(preco, taxa):
res = preco - (preco * taxa/100)
return res
def dobro(preco):
res = preco * 2
return res
def formatacao(preco = 0, moeda = 'R$'):
return f'{moeda}{preco:.2f}'.replace('.', ',')
'''def moeda(preco = 0, moeda = 'R$'):
return f'{moeda}{preco:.2f}'.replace('.', ',')'''
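A quick self-contained check of the formatting behaviour (two of the helpers are restated so the snippet runs on its own): `formatacao` renders two decimal places and swaps the decimal point for the Brazilian comma.

```python
def aumentar(preco, taxa):
    # Same formula as the helper above: add taxa percent to the price.
    return preco + (preco * taxa / 100)

def formatacao(preco=0, moeda='R$'):
    # Two decimals, then '.' replaced by the Brazilian ',' separator.
    return f'{moeda}{preco:.2f}'.replace('.', ',')

print(formatacao(aumentar(200, 15)))  # R$230,00
```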
# File: PythonExercicios/ex016.py (repo: VitorFRodrigues/Python-curso, license: MIT)
from math import trunc
num = float(input('Digite um número com casas decimais: '))
print('O número {} tem a parte inteira {}'.format(num, trunc(num)))
num = float(input('Digite um número com casas decimais: '))
print('O número {} tem a parte inteira {}'.format(num, int(num)))
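`trunc` and `int` agree on floats (both cut toward zero), which is why the two prints above show the same integer part; `math.floor` differs on negative numbers:

```python
import math

x = -3.7
print(math.trunc(x), int(x), math.floor(x))  # -3 -3 -4
```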
# File: models/ac_simple.py (repo: praveeenbadimala/flow_unsupervised, license: MIT)
import torch
import torch.nn as nn
from torch.nn.init import kaiming_normal
def conv(batchNorm, in_planes, out_planes, kernel_size=3, stride=1):
if batchNorm:
return nn.Sequential(
nn.Conv2d(in_planes, out_planes, kernel_size=kernel_size, stride=stride, padding=(kernel_size-1)//2, bias=False),
nn.BatchNorm2d(out_planes),
nn.LeakyReLU(0.1,inplace=True)
)
else:
return nn.Sequential(
nn.Conv2d(in_planes, out_planes, kernel_size=kernel_size, stride=stride, padding=(kernel_size-1)//2, bias=True),
nn.LeakyReLU(0.1,inplace=True)
)
def predict_flow(in_planes):
return nn.Conv2d(in_planes,2,kernel_size=3,stride=1,padding=1,bias=False)
def deconv(in_planes, out_planes):
return nn.Sequential(
nn.ConvTranspose2d(in_planes, out_planes, kernel_size=4, stride=2, padding=1, bias=False),
nn.LeakyReLU(0.1,inplace=True)
)
def crop_like(input, target):
if input.size()[2:] == target.size()[2:]:
return input
else:
return input[:, :, :target.size(2), :target.size(3)]
class Actor(nn.Module):
expansion = 1
def __init__(self,batchNorm=True):
super(Actor,self).__init__()
self.batchNorm = batchNorm
self.conv1 = conv(self.batchNorm, 6, 64, kernel_size=7, stride=2)
self.conv2 = conv(self.batchNorm, 64, 128, kernel_size=5, stride=2)
self.conv3 = conv(self.batchNorm, 128, 256, kernel_size=5, stride=2)
self.conv3_1 = conv(self.batchNorm, 256, 256)
self.conv4 = conv(self.batchNorm, 256, 512, stride=2)
self.conv4_1 = conv(self.batchNorm, 512, 512)
self.conv5 = conv(self.batchNorm, 512, 512, stride=2)
self.conv5_1 = conv(self.batchNorm, 512, 512)
self.conv6 = conv(self.batchNorm, 512, 1024, stride=2)
self.conv6_1 = conv(self.batchNorm,1024, 1024)
self.deconv5 = deconv(1024,512)
self.deconv4 = deconv(1026,256)
self.deconv3 = deconv(770,128)
self.deconv2 = deconv(386,64)
self.predict_flow6 = predict_flow(1024)
self.predict_flow5 = predict_flow(1026)
self.predict_flow4 = predict_flow(770)
self.predict_flow3 = predict_flow(386)
self.predict_flow2 = predict_flow(194)
self.upsampled_flow6_to_5 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow5_to_4 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow4_to_3 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow3_to_2 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
for m in self.modules():
if isinstance(m, nn.Conv2d) or isinstance(m, nn.ConvTranspose2d):
kaiming_normal(m.weight.data)
if m.bias is not None:
m.bias.data.zero_()
elif isinstance(m, nn.BatchNorm2d):
m.weight.data.fill_(1)
m.bias.data.zero_()
def forward(self, x):
out_conv2 = self.conv2(self.conv1(x))
out_conv3 = self.conv3_1(self.conv3(out_conv2))
out_conv4 = self.conv4_1(self.conv4(out_conv3))
out_conv5 = self.conv5_1(self.conv5(out_conv4))
out_conv6 = self.conv6_1(self.conv6(out_conv5))
flow6 = self.predict_flow6(out_conv6)
flow6_up = crop_like(self.upsampled_flow6_to_5(flow6), out_conv5)
out_deconv5 = crop_like(self.deconv5(out_conv6), out_conv5)
concat5 = torch.cat((out_conv5,out_deconv5,flow6_up),1)
flow5 = self.predict_flow5(concat5)
flow5_up = crop_like(self.upsampled_flow5_to_4(flow5), out_conv4)
out_deconv4 = crop_like(self.deconv4(concat5), out_conv4)
concat4 = torch.cat((out_conv4,out_deconv4,flow5_up),1)
flow4 = self.predict_flow4(concat4)
flow4_up = crop_like(self.upsampled_flow4_to_3(flow4), out_conv3)
out_deconv3 = crop_like(self.deconv3(concat4), out_conv3)
concat3 = torch.cat((out_conv3,out_deconv3,flow4_up),1)
flow3 = self.predict_flow3(concat3)
flow3_up = crop_like(self.upsampled_flow3_to_2(flow3), out_conv2)
out_deconv2 = crop_like(self.deconv2(concat3), out_conv2)
concat2 = torch.cat((out_conv2,out_deconv2,flow3_up),1)
flow2 = self.predict_flow2(concat2)
return flow2
def weight_parameters(self):
return [param for name, param in self.named_parameters() if 'weight' in name]
def bias_parameters(self):
return [param for name, param in self.named_parameters() if 'bias' in name]
class Critic(nn.Module):
expansion = 1
def __init__(self,batchNorm=True):
super(Critic,self).__init__()
self.batchNorm = batchNorm
self.conv1 = conv(self.batchNorm, 8, 64, kernel_size=7, stride=2)
self.conv2 = conv(self.batchNorm, 64, 128, kernel_size=5, stride=2)
self.conv3 = conv(self.batchNorm, 128, 256, kernel_size=5, stride=2)
self.conv3_1 = conv(self.batchNorm, 256, 256)
self.conv4 = conv(self.batchNorm, 256, 512, stride=2)
self.conv4_1 = conv(self.batchNorm, 512, 512)
self.conv5 = conv(self.batchNorm, 512, 512, stride=2)
self.conv5_1 = conv(self.batchNorm, 512, 512)
self.conv6 = conv(self.batchNorm, 512, 1024, stride=2)
self.conv6_1 = conv(self.batchNorm,1024, 1024)
self.deconv5 = deconv(1024,512)
self.deconv4 = deconv(1026,256)
self.deconv3 = deconv(770,128)
self.deconv2 = deconv(386,64)
self.predict_flow6 = predict_flow(1024)
self.predict_flow5 = predict_flow(1026)
self.predict_flow4 = predict_flow(770)
self.predict_flow3 = predict_flow(386)
self.predict_flow2 = predict_flow(194)
self.upsampled_flow6_to_5 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow5_to_4 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow4_to_3 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
self.upsampled_flow3_to_2 = nn.ConvTranspose2d(2, 2, 4, 2, 1, bias=False)
for m in self.modules():
if isinstance(m, nn.Conv2d) or isinstance(m, nn.ConvTranspose2d):
kaiming_normal(m.weight.data)
if m.bias is not None:
m.bias.data.zero_()
elif isinstance(m, nn.BatchNorm2d):
m.weight.data.fill_(1)
m.bias.data.zero_()
def forward(self, x):
out_conv2 = self.conv2(self.conv1(x))
out_conv3 = self.conv3_1(self.conv3(out_conv2))
out_conv4 = self.conv4_1(self.conv4(out_conv3))
out_conv5 = self.conv5_1(self.conv5(out_conv4))
out_conv6 = self.conv6_1(self.conv6(out_conv5))
flow6 = self.predict_flow6(out_conv6)
flow6_up = crop_like(self.upsampled_flow6_to_5(flow6), out_conv5)
out_deconv5 = crop_like(self.deconv5(out_conv6), out_conv5)
concat5 = torch.cat((out_conv5,out_deconv5,flow6_up),1)
flow5 = self.predict_flow5(concat5)
flow5_up = crop_like(self.upsampled_flow5_to_4(flow5), out_conv4)
out_deconv4 = crop_like(self.deconv4(concat5), out_conv4)
concat4 = torch.cat((out_conv4,out_deconv4,flow5_up),1)
flow4 = self.predict_flow4(concat4)
flow4_up = crop_like(self.upsampled_flow4_to_3(flow4), out_conv3)
out_deconv3 = crop_like(self.deconv3(concat4), out_conv3)
concat3 = torch.cat((out_conv3,out_deconv3,flow4_up),1)
flow3 = self.predict_flow3(concat3)
flow3_up = crop_like(self.upsampled_flow3_to_2(flow3), out_conv2)
out_deconv2 = crop_like(self.deconv2(concat3), out_conv2)
concat2 = torch.cat((out_conv2,out_deconv2,flow3_up),1)
flow2 = self.predict_flow2(concat2)
expected_energy = flow2.sum(3).sum(2).sum(1)
return expected_energy
def weight_parameters(self):
return [param for name, param in self.named_parameters() if 'weight' in name]
def bias_parameters(self):
return [param for name, param in self.named_parameters() if 'bias' in name]
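Both networks downsample six times (conv1 through conv6, each stride 2 with padding `(k-1)//2`), so a 384 by 512 input (a common FlowNet training resolution, assumed here purely for illustration) reaches 6 by 8 at conv6, and the four deconv stages bring `flow2` back to quarter resolution. A pure-Python sketch of that size arithmetic (`conv_out` and `encoder_sizes` are hypothetical helper names, not part of the model file):

```python
def conv_out(n, k, s):
    """Output length of a Conv2d with padding=(k-1)//2, as built by conv() above."""
    p = (k - 1) // 2
    return (n + 2 * p - k) // s + 1

def encoder_sizes(n):
    # (kernel_size, stride) of the six downsampling convs conv1..conv6;
    # the stride-1 conv*_1 refinement layers leave the size unchanged.
    layers = [(7, 2), (5, 2), (5, 2), (3, 2), (3, 2), (3, 2)]
    sizes = []
    for k, s in layers:
        n = conv_out(n, k, s)
        sizes.append(n)
    return sizes

print(encoder_sizes(384))  # [192, 96, 48, 24, 12, 6]
print(encoder_sizes(512))  # [256, 128, 64, 32, 16, 8]
```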
def ActorLoad(path=None):
"""FlowNetS model architecture from the
"FlowNet: Learning Optical Flow with Convolutional Networks" paper (https://arxiv.org/abs/1504.06852)
Args:
path : path to a pretrained actor network; a new network is created if not set
"""
model = Actor(batchNorm=False)
if path is not None:
data = torch.load(path)
if 'state_dict' in data.keys():
model.load_state_dict(data['state_dict'])
else:
model.load_state_dict(data)
return model
def CriticLoad(path=None):
"""FlowNetS model architecture from the
"FlowNet: Learning Optical Flow with Convolutional Networks" paper (https://arxiv.org/abs/1504.06852)
Args:
path : path to a pretrained critic network; a new network is created if not set
"""
model = Critic(batchNorm=False)
if path is not None:
data = torch.load(path)
if 'state_dict' in data.keys():
model.load_state_dict(data['state_dict'])
else:
model.load_state_dict(data)
return model
# File: Along_isopycnal_property_values.py (repo: UBC-MOAD/analysis_saurav_wcvi, license: Apache-2.0)
import numpy as np
import netCDF4 as nc
from scipy.interpolate import interp1d
NEP_aug = nc.Dataset('/home/ssahu/saurav/NEP36_T_S_Spice_aug.nc')
sal_aug = NEP_aug.variables['vosaline']
temp_aug = NEP_aug.variables['votemper']
spic_aug = NEP_aug.variables['spiciness']
rho_aug = NEP_aug.variables['density']
zlevels = nc.Dataset('/data/mdunphy/NEP036-N30-OUT/CDF_COMB_COMPRESSED/NEP036-N30_IN_20140915_00001440_grid_T.nc').variables['deptht']
mesh_mask = nc.Dataset('/data/mdunphy/NEP036-N30-OUT/INV/mesh_mask.nc')
mbathy = mesh_mask['mbathy'][0,...]
NEP_jul = nc.Dataset('/home/ssahu/saurav/NEP36_T_S_Spice_july.nc')
sal_jul = NEP_jul.variables['vosaline']
temp_jul = NEP_jul.variables['votemper']
spic_jul = NEP_jul.variables['spiciness']
rho_jul = NEP_jul.variables['density']
y_wcvi_slice = np.arange(230,350)
x_wcvi_slice = np.arange(550,650)
#znew = np.arange(0,150,0.1)
#dens_cont = np.arange(25.,27.,0.25/8.)
#tol = 0.001
#spic_iso = np.empty((rho_jul.shape[0],dens_cont.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
#rho_iso = np.empty((rho_jul.shape[0],dens_cont.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
#temp_iso = np.empty((rho_jul.shape[0],dens_cont.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
#sal_iso = np.empty((rho_jul.shape[0],dens_cont.shape[0],y_wcvi_slice.shape[0],y_wcvi_slice.shape[0]))
#t =12
znew = np.arange(0,250,0.05)
den = np.arange(23.,28.,0.1)
tol = 0.01
#rho_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
#spic_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
#rho_0 = rho_jul[t,:,y_wcvi_slice,x_wcvi_slice] - 1000
#spic_0 = spic_jul[t,:,y_wcvi_slice,x_wcvi_slice]
spic_time_iso = np.empty((spic_jul.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
tem_time_iso = np.empty((spic_jul.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
sal_time_iso = np.empty((spic_jul.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
# Interpolate each water column onto a fine vertical grid, then average the
# property values that fall within +/- tol of each target isopycnal density.
for t in np.arange(spic_time_iso.shape[0]):
    rho_0 = rho_jul[t,:,y_wcvi_slice,x_wcvi_slice] - 1000
    spic_0 = spic_jul[t,:,y_wcvi_slice,x_wcvi_slice]
    tem_0 = temp_jul[t,:,y_wcvi_slice,x_wcvi_slice]
    sal_0 = sal_jul[t,:,y_wcvi_slice,x_wcvi_slice]
    spic_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    tem_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    sal_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    for iso in np.arange(den.shape[0]):
        spic_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        tem_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        sal_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        for j in np.arange(y_wcvi_slice.shape[0]):
            spic_iso = np.empty(x_wcvi_slice.shape[0])
            sal_iso = np.empty(x_wcvi_slice.shape[0])
            tem_iso = np.empty(x_wcvi_slice.shape[0])
            rho_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            spic_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            tem_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            sal_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            for i in np.arange(rho_new.shape[1]):
                # interpolate the column onto the fine grid znew
                f = interp1d(zlevels[:],rho_0[:,j,i],fill_value='extrapolate')
                g = interp1d(zlevels[:],spic_0[:,j,i],fill_value='extrapolate')
                h = interp1d(zlevels[:],tem_0[:,j,i],fill_value='extrapolate')
                p = interp1d(zlevels[:],sal_0[:,j,i],fill_value='extrapolate')
                rho_new[:,i] = f(znew[:])
                spic_new[:,i] = g(znew[:])
                tem_new[:,i] = h(znew[:])
                sal_new[:,i] = p(znew[:])
                # average the values lying within +/- tol of the isopycnal
                V = rho_new[:,i]
                ind = (V>den[iso]-tol)&(V<den[iso]+tol)
                spic_iso[i] = np.nanmean(spic_new[ind,i])
                tem_iso[i] = np.nanmean(tem_new[ind,i])
                sal_iso[i] = np.nanmean(sal_new[ind,i])
                spic_den[j,i] = spic_iso[i]
                tem_den[j,i] = tem_iso[i]
                sal_den[j,i] = sal_iso[i]
                spic_spec_iso[iso,j,i] = spic_den[j,i]
                tem_spec_iso[iso,j,i] = tem_den[j,i]
                sal_spec_iso[iso,j,i] = sal_den[j,i]
                spic_time_iso[t,iso,j,i] = spic_spec_iso[iso,j,i]
                tem_time_iso[t,iso,j,i] = tem_spec_iso[iso,j,i]
                sal_time_iso[t,iso,j,i] = sal_spec_iso[iso,j,i]
print("Calculating the depths of the isopycnals (in July) for 3D plots")
depth_rho_0 = np.empty((sal_time_iso.shape[0],sal_time_iso.shape[1],rho_jul.shape[2],rho_jul.shape[3]))
for t in np.arange(spic_time_iso.shape[0]):
    for iso in np.arange(den.shape[0]):
        for j in np.arange(230,350):
            for i in np.arange(550,650):
                if mbathy[j,i] > 0:
                    depth_rho_0[t,iso,j,i] = np.interp(den[iso], rho_jul[t,:mbathy[j, i], j, i]-1000, zlevels[:mbathy[j, i]])
# NumPy fancy indexing with two index arrays of different lengths (120 and 100)
# does not broadcast, so slice the WCVI window directly
depth_rho = depth_rho_0[:, :, 230:350, 550:650]
#for den in np.arange(dens_cont.shape[0]):
# for t in np.arange(rho_jul.shape[0]):
# for j in np.arange(y_wcvi_slice.shape[0]):
#
# for i in np.arange(y_wcvi_slice.shape[0]):
#
# print(i)
#Choose the data slice in x-z
# rho_0 = rho_jul[t,:,j,x_wcvi_slice] - 1000
# spic_0 = spic_jul[t,:,j,x_wcvi_slice]
# temp_0 = temp_jul[t,:,j,x_wcvi_slice]
# sal_0 = sal_jul[t,:,j,x_wcvi_slice]
#
# # initialise the shapes of the variables#
# rho_new = np.empty((znew.shape[0],rho_0.shape[1]))
# spic_new = np.empty((znew.shape[0],rho_0.shape[1]))
# temp_new = np.empty((znew.shape[0],rho_0.shape[1]))
# sal_new = np.empty((znew.shape[0],rho_0.shape[1]))
# ind = np.empty((znew.shape[0],rho_0.shape[1]))
# Interpolate over z to choose the exact values of z for the isopycnals
# f = interp1d(zlevels[:],rho_0[:,i],fill_value='extrapolate')
# g = interp1d(zlevels[:],spic_0[:,i],fill_value='extrapolate')
# h = interp1d(zlevels[:],temp_0[:,i],fill_value='extrapolate')
# wine = interp1d(zlevels[:],sal_0[:,i],fill_value='extrapolate')
#
# # find the values of the variables at the fine z resolutions
#
#
# rho_new[:,i] = f(znew[:])
# spic_new[:,i] = g(znew[:])
# temp_new[:,i] = h(znew[:])
# sal_new[:,i] = wine(znew[:])
#
# # find the indices which relate to those isopycnal values in x and z from a created boolean masked tuple ind
#
# V = rho_new
# ind = np.where((V>dens_cont[den]-tol)&(V<dens_cont[den]+tol))
#
# edit the initialised array with the values returned from the isopycnal indices
# spic_iso[t,den,j,i] = spic_new[ind[0][:],ind[1][:]]
# rho_iso[t,den,j,i] = rho_new[ind[0][:],ind[1][:]]
# temp_iso[t,den,j,i] = temp_new[ind[0][:],ind[1][:]]
# sal_iso[t,den,j,i] = sal_new[ind[0][:],ind[1][:]]
print("Writing the isopycnal data for July")
path_to_save = '/home/ssahu/saurav/'
bdy_file = nc.Dataset(path_to_save + 'NEP36_jul_along_isopycnal.nc', 'w');
bdy_file.createDimension('x', spic_time_iso.shape[3]);
bdy_file.createDimension('y', spic_time_iso.shape[2]);
bdy_file.createDimension('isot', spic_time_iso.shape[1]);
bdy_file.createDimension('time_counter', None);
x = bdy_file.createVariable('x', 'int32', ('x',), zlib=True);
x.units = 'indices';
x.longname = 'x indices of NEP36';
y = bdy_file.createVariable('y', 'int32', ('y',), zlib=True);
y.units = 'indices';
y.longname = 'y indices of NEP36';
isot = bdy_file.createVariable('isot', 'float32', ('isot',), zlib=True);
isot.units = 'm';
isot.longname = 'Vertical isopycnal Levels';
time_counter = bdy_file.createVariable('time_counter', 'int32', ('time_counter',), zlib=True);
time_counter.units = 's';
time_counter.longname = 'time';
spiciness = bdy_file.createVariable('spiciness', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
temperature = bdy_file.createVariable('temperature', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
salinity = bdy_file.createVariable('salinity', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
zdepth_of_isopycnal = bdy_file.createVariable('Depth of Isopycnal', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
#density = bdy_file.createVariable('density', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
spiciness[...] = spic_time_iso[...];
temperature[...] = tem_time_iso[...];
salinity[...] = sal_time_iso[...];
zdepth_of_isopycnal[...] = depth_rho[...]
#density[...] = rho_iso[...];
isot[...] = den[:];
x[...] = x_wcvi_slice[:];
y[...] = y_wcvi_slice[:];
bdy_file.close()
print("File for July Written: Thanks")
print("Starting interpolation and data extraction for August")
spic_time_iso = np.empty((spic_aug.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
tem_time_iso = np.empty((spic_aug.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
sal_time_iso = np.empty((spic_aug.shape[0],den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
for t in np.arange(spic_time_iso.shape[0]):
    rho_0 = rho_aug[t,:,y_wcvi_slice,x_wcvi_slice] - 1000
    spic_0 = spic_aug[t,:,y_wcvi_slice,x_wcvi_slice]
    tem_0 = temp_aug[t,:,y_wcvi_slice,x_wcvi_slice]
    sal_0 = sal_aug[t,:,y_wcvi_slice,x_wcvi_slice]
    spic_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    tem_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    sal_spec_iso = np.empty((den.shape[0],y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
    for iso in np.arange(den.shape[0]):
        spic_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        tem_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        sal_den = np.empty((y_wcvi_slice.shape[0],x_wcvi_slice.shape[0]))
        for j in np.arange(y_wcvi_slice.shape[0]):
            spic_iso = np.empty(x_wcvi_slice.shape[0])
            sal_iso = np.empty(x_wcvi_slice.shape[0])
            tem_iso = np.empty(x_wcvi_slice.shape[0])
            rho_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            spic_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            tem_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            sal_new = np.empty((znew.shape[0],x_wcvi_slice.shape[0]))
            for i in np.arange(rho_new.shape[1]):
                f = interp1d(zlevels[:],rho_0[:,j,i],fill_value='extrapolate')
                g = interp1d(zlevels[:],spic_0[:,j,i],fill_value='extrapolate')
                h = interp1d(zlevels[:],tem_0[:,j,i],fill_value='extrapolate')
                p = interp1d(zlevels[:],sal_0[:,j,i],fill_value='extrapolate')
                rho_new[:,i] = f(znew[:])
                spic_new[:,i] = g(znew[:])
                tem_new[:,i] = h(znew[:])
                sal_new[:,i] = p(znew[:])
                V = rho_new[:,i]
                ind = (V>den[iso]-tol)&(V<den[iso]+tol)
                spic_iso[i] = np.nanmean(spic_new[ind,i])
                tem_iso[i] = np.nanmean(tem_new[ind,i])
                sal_iso[i] = np.nanmean(sal_new[ind,i])
                spic_den[j,i] = spic_iso[i]
                tem_den[j,i] = tem_iso[i]
                sal_den[j,i] = sal_iso[i]
                spic_spec_iso[iso,j,i] = spic_den[j,i]
                tem_spec_iso[iso,j,i] = tem_den[j,i]
                sal_spec_iso[iso,j,i] = sal_den[j,i]
                spic_time_iso[t,iso,j,i] = spic_spec_iso[iso,j,i]
                tem_time_iso[t,iso,j,i] = tem_spec_iso[iso,j,i]
                sal_time_iso[t,iso,j,i] = sal_spec_iso[iso,j,i]
print("Calculating the depths of the isopycnals (in August) for 3D plots")
depth_rho_0 = np.empty((sal_time_iso.shape[0],sal_time_iso.shape[1],rho_aug.shape[2],rho_aug.shape[3]))
for t in np.arange(spic_time_iso.shape[0]):
    for iso in np.arange(den.shape[0]):
        for j in np.arange(230,350):
            for i in np.arange(550,650):
                if mbathy[j,i] > 0:
                    depth_rho_0[t,iso,j,i] = np.interp(den[iso], rho_aug[t,:mbathy[j, i], j, i]-1000, zlevels[:mbathy[j, i]])
# as for July, slice the WCVI window directly rather than fancy-indexing with
# two index arrays of different lengths
depth_rho = depth_rho_0[:, :, 230:350, 550:650]
print("Writing the isopycnal data for August")
path_to_save = '/home/ssahu/saurav/'
bdy_file = nc.Dataset(path_to_save + 'NEP36_aug_along_isopycnal.nc', 'w');
bdy_file.createDimension('x', spic_time_iso.shape[3]);
bdy_file.createDimension('y', spic_time_iso.shape[2]);
bdy_file.createDimension('isot', spic_time_iso.shape[1]);
bdy_file.createDimension('time_counter', None);
x = bdy_file.createVariable('x', 'int32', ('x',), zlib=True);
x.units = 'indices';
x.longname = 'x indices of NEP36';
y = bdy_file.createVariable('y', 'int32', ('y',), zlib=True);
y.units = 'indices';
y.longname = 'y indices of NEP36';
isot = bdy_file.createVariable('isot', 'float32', ('isot',), zlib=True);
isot.units = 'm';
isot.longname = 'Vertical isopycnal Levels';
time_counter = bdy_file.createVariable('time_counter', 'int32', ('time_counter',), zlib=True);
time_counter.units = 's';
time_counter.longname = 'time';
spiciness = bdy_file.createVariable('spiciness', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
temperature = bdy_file.createVariable('temperature', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
salinity = bdy_file.createVariable('salinity', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
zdepth_of_isopycnal = bdy_file.createVariable('Depth of Isopycnal', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
#density = bdy_file.createVariable('density', 'float32', ('time_counter','isot', 'y', 'x'), zlib=True)
spiciness[...] = spic_time_iso[...];
temperature[...] = tem_time_iso[...];
salinity[...] = sal_time_iso[...];
zdepth_of_isopycnal[...] = depth_rho[...]
#density[...] = rho_iso[...];
isot[...] = den[:];
x[...] = x_wcvi_slice[:];
y[...] = y_wcvi_slice[:];
bdy_file.close()
print("File for August Written: Thanks")
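The July/August loops above repeat one core step per water column: refine the vertical grid by interpolation, then average the property values lying within +/- tol of a target density. A self-contained sketch of that step on a single synthetic profile (all numbers here are made up for illustration, not model output):

```python
import numpy as np
from scipy.interpolate import interp1d

zlev = np.array([0., 10., 20., 30., 40.])               # coarse model depths (m)
rho_profile = np.array([24.0, 24.5, 25.0, 25.5, 26.0])  # sigma-t at those depths
spice_profile = np.array([0.1, 0.2, 0.3, 0.4, 0.5])     # spiciness at those depths

znew = np.arange(0., 40., 0.05)                          # fine vertical grid
rho_new = interp1d(zlev, rho_profile, fill_value='extrapolate')(znew)
spice_new = interp1d(zlev, spice_profile, fill_value='extrapolate')(znew)

target, tol = 25.0, 0.01                                 # isopycnal and tolerance
ind = (rho_new > target - tol) & (rho_new < target + tol)
spice_on_iso = np.nanmean(spice_new[ind])                # spiciness on the 25.0 isopycnal
```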
# Exercicios-mundo-2/desafio049.py (talitadeoa/CEV-Exercicios-Python, MIT)
# Redo the multiplication table from challenge 009 using a for loop
# lol, I already did challenge 009 using a for loop
# coding=utf-8
# sdk/python/pulumi_gcp/billing/budget.py (sisisin/pulumi-gcp, ECL-2.0 / Apache-2.0)
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
from . import outputs
from ._inputs import *
__all__ = ['BudgetArgs', 'Budget']
@pulumi.input_type
class BudgetArgs:
def __init__(__self__, *,
amount: pulumi.Input['BudgetAmountArgs'],
billing_account: pulumi.Input[str],
threshold_rules: pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]],
all_updates_rule: Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']] = None,
budget_filter: Optional[pulumi.Input['BudgetBudgetFilterArgs']] = None,
display_name: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Budget resource.
:param pulumi.Input['BudgetAmountArgs'] amount: The budgeted amount for each usage period.
Structure is documented below.
:param pulumi.Input[str] billing_account: ID of the billing account to set a budget on.
:param pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]] threshold_rules: Rules that trigger alerts (notifications of thresholds being
crossed) when spend exceeds the specified percentages of the
budget.
Structure is documented below.
:param pulumi.Input['BudgetAllUpdatesRuleArgs'] all_updates_rule: Defines notifications that are sent on every update to the
billing account's spend, regardless of the thresholds defined
using threshold rules.
Structure is documented below.
:param pulumi.Input['BudgetBudgetFilterArgs'] budget_filter: Filters that define which resources are used to compute the actual
spend against the budget.
Structure is documented below.
:param pulumi.Input[str] display_name: User data for display name in UI. Must be <= 60 chars.
"""
pulumi.set(__self__, "amount", amount)
pulumi.set(__self__, "billing_account", billing_account)
pulumi.set(__self__, "threshold_rules", threshold_rules)
if all_updates_rule is not None:
pulumi.set(__self__, "all_updates_rule", all_updates_rule)
if budget_filter is not None:
pulumi.set(__self__, "budget_filter", budget_filter)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
@property
@pulumi.getter
def amount(self) -> pulumi.Input['BudgetAmountArgs']:
"""
The budgeted amount for each usage period.
Structure is documented below.
"""
return pulumi.get(self, "amount")
@amount.setter
def amount(self, value: pulumi.Input['BudgetAmountArgs']):
pulumi.set(self, "amount", value)
@property
@pulumi.getter(name="billingAccount")
def billing_account(self) -> pulumi.Input[str]:
"""
ID of the billing account to set a budget on.
"""
return pulumi.get(self, "billing_account")
@billing_account.setter
def billing_account(self, value: pulumi.Input[str]):
pulumi.set(self, "billing_account", value)
@property
@pulumi.getter(name="thresholdRules")
def threshold_rules(self) -> pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]]:
"""
Rules that trigger alerts (notifications of thresholds being
crossed) when spend exceeds the specified percentages of the
budget.
Structure is documented below.
"""
return pulumi.get(self, "threshold_rules")
@threshold_rules.setter
def threshold_rules(self, value: pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]]):
pulumi.set(self, "threshold_rules", value)
@property
@pulumi.getter(name="allUpdatesRule")
def all_updates_rule(self) -> Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']]:
"""
Defines notifications that are sent on every update to the
billing account's spend, regardless of the thresholds defined
using threshold rules.
Structure is documented below.
"""
return pulumi.get(self, "all_updates_rule")
@all_updates_rule.setter
def all_updates_rule(self, value: Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']]):
pulumi.set(self, "all_updates_rule", value)
@property
@pulumi.getter(name="budgetFilter")
def budget_filter(self) -> Optional[pulumi.Input['BudgetBudgetFilterArgs']]:
"""
Filters that define which resources are used to compute the actual
spend against the budget.
Structure is documented below.
"""
return pulumi.get(self, "budget_filter")
@budget_filter.setter
def budget_filter(self, value: Optional[pulumi.Input['BudgetBudgetFilterArgs']]):
pulumi.set(self, "budget_filter", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
User data for display name in UI. Must be <= 60 chars.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@pulumi.input_type
class _BudgetState:
def __init__(__self__, *,
all_updates_rule: Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']] = None,
amount: Optional[pulumi.Input['BudgetAmountArgs']] = None,
billing_account: Optional[pulumi.Input[str]] = None,
budget_filter: Optional[pulumi.Input['BudgetBudgetFilterArgs']] = None,
display_name: Optional[pulumi.Input[str]] = None,
name: Optional[pulumi.Input[str]] = None,
threshold_rules: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]]] = None):
"""
Input properties used for looking up and filtering Budget resources.
:param pulumi.Input['BudgetAllUpdatesRuleArgs'] all_updates_rule: Defines notifications that are sent on every update to the
billing account's spend, regardless of the thresholds defined
using threshold rules.
Structure is documented below.
:param pulumi.Input['BudgetAmountArgs'] amount: The budgeted amount for each usage period.
Structure is documented below.
:param pulumi.Input[str] billing_account: ID of the billing account to set a budget on.
:param pulumi.Input['BudgetBudgetFilterArgs'] budget_filter: Filters that define which resources are used to compute the actual
spend against the budget.
Structure is documented below.
:param pulumi.Input[str] display_name: User data for display name in UI. Must be <= 60 chars.
:param pulumi.Input[str] name: Resource name of the budget. The resource name implies the scope of a budget. Values are of the form
billingAccounts/{billingAccountId}/budgets/{budgetId}.
:param pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]] threshold_rules: Rules that trigger alerts (notifications of thresholds being
crossed) when spend exceeds the specified percentages of the
budget.
Structure is documented below.
"""
if all_updates_rule is not None:
pulumi.set(__self__, "all_updates_rule", all_updates_rule)
if amount is not None:
pulumi.set(__self__, "amount", amount)
if billing_account is not None:
pulumi.set(__self__, "billing_account", billing_account)
if budget_filter is not None:
pulumi.set(__self__, "budget_filter", budget_filter)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if name is not None:
pulumi.set(__self__, "name", name)
if threshold_rules is not None:
pulumi.set(__self__, "threshold_rules", threshold_rules)
@property
@pulumi.getter(name="allUpdatesRule")
def all_updates_rule(self) -> Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']]:
"""
Defines notifications that are sent on every update to the
billing account's spend, regardless of the thresholds defined
using threshold rules.
Structure is documented below.
"""
return pulumi.get(self, "all_updates_rule")
@all_updates_rule.setter
def all_updates_rule(self, value: Optional[pulumi.Input['BudgetAllUpdatesRuleArgs']]):
pulumi.set(self, "all_updates_rule", value)
@property
@pulumi.getter
def amount(self) -> Optional[pulumi.Input['BudgetAmountArgs']]:
"""
The budgeted amount for each usage period.
Structure is documented below.
"""
return pulumi.get(self, "amount")
@amount.setter
def amount(self, value: Optional[pulumi.Input['BudgetAmountArgs']]):
pulumi.set(self, "amount", value)
@property
@pulumi.getter(name="billingAccount")
def billing_account(self) -> Optional[pulumi.Input[str]]:
"""
ID of the billing account to set a budget on.
"""
return pulumi.get(self, "billing_account")
@billing_account.setter
def billing_account(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "billing_account", value)
@property
@pulumi.getter(name="budgetFilter")
def budget_filter(self) -> Optional[pulumi.Input['BudgetBudgetFilterArgs']]:
"""
Filters that define which resources are used to compute the actual
spend against the budget.
Structure is documented below.
"""
return pulumi.get(self, "budget_filter")
@budget_filter.setter
def budget_filter(self, value: Optional[pulumi.Input['BudgetBudgetFilterArgs']]):
pulumi.set(self, "budget_filter", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
User data for display name in UI. Must be <= 60 chars.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter
def name(self) -> Optional[pulumi.Input[str]]:
"""
Resource name of the budget. The resource name implies the scope of a budget. Values are of the form
billingAccounts/{billingAccountId}/budgets/{budgetId}.
"""
return pulumi.get(self, "name")
@name.setter
def name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "name", value)
@property
@pulumi.getter(name="thresholdRules")
def threshold_rules(self) -> Optional[pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]]]:
"""
Rules that trigger alerts (notifications of thresholds being
crossed) when spend exceeds the specified percentages of the
budget.
Structure is documented below.
"""
return pulumi.get(self, "threshold_rules")
@threshold_rules.setter
def threshold_rules(self, value: Optional[pulumi.Input[Sequence[pulumi.Input['BudgetThresholdRuleArgs']]]]):
pulumi.set(self, "threshold_rules", value)
class Budget(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
all_updates_rule: Optional[pulumi.Input[pulumi.InputType['BudgetAllUpdatesRuleArgs']]] = None,
amount: Optional[pulumi.Input[pulumi.InputType['BudgetAmountArgs']]] = None,
billing_account: Optional[pulumi.Input[str]] = None,
budget_filter: Optional[pulumi.Input[pulumi.InputType['BudgetBudgetFilterArgs']]] = None,
display_name: Optional[pulumi.Input[str]] = None,
threshold_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BudgetThresholdRuleArgs']]]]] = None,
__props__=None):
"""
Budget configuration for a billing account.
To get more information about Budget, see:
* [API documentation](https://cloud.google.com/billing/docs/reference/budget/rest/v1/billingAccounts.budgets)
* How-to Guides
* [Creating a budget](https://cloud.google.com/billing/docs/how-to/budgets)
> **Warning:** If you are using User ADCs (Application Default Credentials) with this resource,
you must specify a `billing_project` and set `user_project_override` to true
in the provider configuration. Otherwise the Billing Budgets API will return a 403 error.
Your account must have the `serviceusage.services.use` permission on the
`billing_project` you defined.
## Example Usage
### Billing Budget Basic
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
amount=gcp.billing.BudgetAmountArgs(
specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
currency_code="USD",
units="100000",
),
),
threshold_rules=[gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=0.5,
)])
```
### Billing Budget Lastperiod
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
budget_filter=gcp.billing.BudgetBudgetFilterArgs(
projects=[f"projects/{project.number}"],
),
amount=gcp.billing.BudgetAmountArgs(
last_period_amount=True,
),
threshold_rules=[gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=10,
)])
```
### Billing Budget Filter
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
budget_filter=gcp.billing.BudgetBudgetFilterArgs(
projects=[f"projects/{project.number}"],
credit_types_treatment="EXCLUDE_ALL_CREDITS",
services=["services/24E6-581D-38E5"],
),
amount=gcp.billing.BudgetAmountArgs(
specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
currency_code="USD",
units="100000",
),
),
threshold_rules=[
gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=0.5,
),
gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=0.9,
spend_basis="FORECASTED_SPEND",
),
])
```
### Billing Budget Notify
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
notification_channel = gcp.monitoring.NotificationChannel("notificationChannel",
display_name="Example Notification Channel",
type="email",
labels={
"email_address": "address@example.com",
})
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
budget_filter=gcp.billing.BudgetBudgetFilterArgs(
projects=[f"projects/{project.number}"],
),
amount=gcp.billing.BudgetAmountArgs(
specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
currency_code="USD",
units="100000",
),
),
threshold_rules=[
gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=1,
),
gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=1,
spend_basis="FORECASTED_SPEND",
),
],
all_updates_rule=gcp.billing.BudgetAllUpdatesRuleArgs(
monitoring_notification_channels=[notification_channel.id],
disable_default_iam_recipients=True,
))
```
## Import
Budget can be imported using any of these accepted formats
```sh
$ pulumi import gcp:billing/budget:Budget default billingAccounts/{{billing_account}}/budgets/{{name}}
```
```sh
$ pulumi import gcp:billing/budget:Budget default {{billing_account}}/{{name}}
```
```sh
$ pulumi import gcp:billing/budget:Budget default {{name}}
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[pulumi.InputType['BudgetAllUpdatesRuleArgs']] all_updates_rule: Defines notifications that are sent on every update to the
billing account's spend, regardless of the thresholds defined
using threshold rules.
Structure is documented below.
:param pulumi.Input[pulumi.InputType['BudgetAmountArgs']] amount: The budgeted amount for each usage period.
Structure is documented below.
:param pulumi.Input[str] billing_account: ID of the billing account to set a budget on.
:param pulumi.Input[pulumi.InputType['BudgetBudgetFilterArgs']] budget_filter: Filters that define which resources are used to compute the actual
spend against the budget.
Structure is documented below.
:param pulumi.Input[str] display_name: User data for display name in UI. Must be <= 60 chars.
:param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BudgetThresholdRuleArgs']]]] threshold_rules: Rules that trigger alerts (notifications of thresholds being
crossed) when spend exceeds the specified percentages of the
budget.
Structure is documented below.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: BudgetArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Budget configuration for a billing account.
To get more information about Budget, see:
* [API documentation](https://cloud.google.com/billing/docs/reference/budget/rest/v1/billingAccounts.budgets)
* How-to Guides
* [Creating a budget](https://cloud.google.com/billing/docs/how-to/budgets)
> **Warning:** If you are using User ADCs (Application Default Credentials) with this resource,
you must specify a `billing_project` and set `user_project_override` to true
in the provider configuration. Otherwise the Billing Budgets API will return a 403 error.
Your account must have the `serviceusage.services.use` permission on the
`billing_project` you defined.
## Example Usage
### Billing Budget Basic
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
amount=gcp.billing.BudgetAmountArgs(
specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
currency_code="USD",
units="100000",
),
),
threshold_rules=[gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=0.5,
)])
```
### Billing Budget Lastperiod
```python
import pulumi
import pulumi_gcp as gcp
account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
budget = gcp.billing.Budget("budget",
billing_account=account.id,
display_name="Example Billing Budget",
budget_filter=gcp.billing.BudgetBudgetFilterArgs(
projects=[f"projects/{project.number}"],
),
amount=gcp.billing.BudgetAmountArgs(
last_period_amount=True,
),
threshold_rules=[gcp.billing.BudgetThresholdRuleArgs(
threshold_percent=10,
)])
```
### Billing Budget Filter
```python
import pulumi
import pulumi_gcp as gcp

account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
budget = gcp.billing.Budget("budget",
    billing_account=account.id,
    display_name="Example Billing Budget",
    budget_filter=gcp.billing.BudgetBudgetFilterArgs(
        projects=[f"projects/{project.number}"],
        credit_types_treatment="EXCLUDE_ALL_CREDITS",
        services=["services/24E6-581D-38E5"],
    ),
    amount=gcp.billing.BudgetAmountArgs(
        specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
            currency_code="USD",
            units="100000",
        ),
    ),
    threshold_rules=[
        gcp.billing.BudgetThresholdRuleArgs(
            threshold_percent=0.5,
        ),
        gcp.billing.BudgetThresholdRuleArgs(
            threshold_percent=0.9,
            spend_basis="FORECASTED_SPEND",
        ),
    ])
```
### Billing Budget Notify
```python
import pulumi
import pulumi_gcp as gcp

account = gcp.organizations.get_billing_account(billing_account="000000-0000000-0000000-000000")
project = gcp.organizations.get_project()
notification_channel = gcp.monitoring.NotificationChannel("notificationChannel",
    display_name="Example Notification Channel",
    type="email",
    labels={
        "email_address": "address@example.com",
    })
budget = gcp.billing.Budget("budget",
    billing_account=account.id,
    display_name="Example Billing Budget",
    budget_filter=gcp.billing.BudgetBudgetFilterArgs(
        projects=[f"projects/{project.number}"],
    ),
    amount=gcp.billing.BudgetAmountArgs(
        specified_amount=gcp.billing.BudgetAmountSpecifiedAmountArgs(
            currency_code="USD",
            units="100000",
        ),
    ),
    threshold_rules=[
        gcp.billing.BudgetThresholdRuleArgs(
            threshold_percent=1,
        ),
        gcp.billing.BudgetThresholdRuleArgs(
            threshold_percent=1,
            spend_basis="FORECASTED_SPEND",
        ),
    ],
    all_updates_rule=gcp.billing.BudgetAllUpdatesRuleArgs(
        monitoring_notification_channels=[notification_channel.id],
        disable_default_iam_recipients=True,
    ))
```
## Import
A Budget can be imported using any of these accepted formats:
```sh
$ pulumi import gcp:billing/budget:Budget default billingAccounts/{{billing_account}}/budgets/{{name}}
```
```sh
$ pulumi import gcp:billing/budget:Budget default {{billing_account}}/{{name}}
```
```sh
$ pulumi import gcp:billing/budget:Budget default {{name}}
```
:param str resource_name: The name of the resource.
:param BudgetArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
    def __init__(__self__, resource_name: str, *args, **kwargs):
        resource_args, opts = _utilities.get_resource_args_opts(BudgetArgs, pulumi.ResourceOptions, *args, **kwargs)
        if resource_args is not None:
            __self__._internal_init(resource_name, opts, **resource_args.__dict__)
        else:
            __self__._internal_init(resource_name, *args, **kwargs)

    def _internal_init(__self__,
                 resource_name: str,
                 opts: Optional[pulumi.ResourceOptions] = None,
                 all_updates_rule: Optional[pulumi.Input[pulumi.InputType['BudgetAllUpdatesRuleArgs']]] = None,
                 amount: Optional[pulumi.Input[pulumi.InputType['BudgetAmountArgs']]] = None,
                 billing_account: Optional[pulumi.Input[str]] = None,
                 budget_filter: Optional[pulumi.Input[pulumi.InputType['BudgetBudgetFilterArgs']]] = None,
                 display_name: Optional[pulumi.Input[str]] = None,
                 threshold_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BudgetThresholdRuleArgs']]]]] = None,
                 __props__=None):
        if opts is None:
            opts = pulumi.ResourceOptions()
        if not isinstance(opts, pulumi.ResourceOptions):
            raise TypeError('Expected resource options to be a ResourceOptions instance')
        if opts.version is None:
            opts.version = _utilities.get_version()
        if opts.id is None:
            if __props__ is not None:
                raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
            __props__ = BudgetArgs.__new__(BudgetArgs)

            __props__.__dict__["all_updates_rule"] = all_updates_rule
            if amount is None and not opts.urn:
                raise TypeError("Missing required property 'amount'")
            __props__.__dict__["amount"] = amount
            if billing_account is None and not opts.urn:
                raise TypeError("Missing required property 'billing_account'")
            __props__.__dict__["billing_account"] = billing_account
            __props__.__dict__["budget_filter"] = budget_filter
            __props__.__dict__["display_name"] = display_name
            if threshold_rules is None and not opts.urn:
                raise TypeError("Missing required property 'threshold_rules'")
            __props__.__dict__["threshold_rules"] = threshold_rules
            __props__.__dict__["name"] = None
        super(Budget, __self__).__init__(
            'gcp:billing/budget:Budget',
            resource_name,
            __props__,
            opts)
    @staticmethod
    def get(resource_name: str,
            id: pulumi.Input[str],
            opts: Optional[pulumi.ResourceOptions] = None,
            all_updates_rule: Optional[pulumi.Input[pulumi.InputType['BudgetAllUpdatesRuleArgs']]] = None,
            amount: Optional[pulumi.Input[pulumi.InputType['BudgetAmountArgs']]] = None,
            billing_account: Optional[pulumi.Input[str]] = None,
            budget_filter: Optional[pulumi.Input[pulumi.InputType['BudgetBudgetFilterArgs']]] = None,
            display_name: Optional[pulumi.Input[str]] = None,
            name: Optional[pulumi.Input[str]] = None,
            threshold_rules: Optional[pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BudgetThresholdRuleArgs']]]]] = None) -> 'Budget':
        """
        Get an existing Budget resource's state with the given name, id, and optional extra
        properties used to qualify the lookup.

        :param str resource_name: The unique name of the resulting resource.
        :param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
        :param pulumi.ResourceOptions opts: Options for the resource.
        :param pulumi.Input[pulumi.InputType['BudgetAllUpdatesRuleArgs']] all_updates_rule: Defines notifications that are sent on every update to the
               billing account's spend, regardless of the thresholds defined
               using threshold rules.
               Structure is documented below.
        :param pulumi.Input[pulumi.InputType['BudgetAmountArgs']] amount: The budgeted amount for each usage period.
               Structure is documented below.
        :param pulumi.Input[str] billing_account: ID of the billing account to set a budget on.
        :param pulumi.Input[pulumi.InputType['BudgetBudgetFilterArgs']] budget_filter: Filters that define which resources are used to compute the actual
               spend against the budget.
               Structure is documented below.
        :param pulumi.Input[str] display_name: User data for display name in UI. Must be <= 60 chars.
        :param pulumi.Input[str] name: Resource name of the budget. The resource name implies the scope of a budget. Values are of the form
               billingAccounts/{billingAccountId}/budgets/{budgetId}.
        :param pulumi.Input[Sequence[pulumi.Input[pulumi.InputType['BudgetThresholdRuleArgs']]]] threshold_rules: Rules that trigger alerts (notifications of thresholds being
               crossed) when spend exceeds the specified percentages of the
               budget.
               Structure is documented below.
        """
        opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))

        __props__ = _BudgetState.__new__(_BudgetState)

        __props__.__dict__["all_updates_rule"] = all_updates_rule
        __props__.__dict__["amount"] = amount
        __props__.__dict__["billing_account"] = billing_account
        __props__.__dict__["budget_filter"] = budget_filter
        __props__.__dict__["display_name"] = display_name
        __props__.__dict__["name"] = name
        __props__.__dict__["threshold_rules"] = threshold_rules
        return Budget(resource_name, opts=opts, __props__=__props__)
    @property
    @pulumi.getter(name="allUpdatesRule")
    def all_updates_rule(self) -> pulumi.Output[Optional['outputs.BudgetAllUpdatesRule']]:
        """
        Defines notifications that are sent on every update to the
        billing account's spend, regardless of the thresholds defined
        using threshold rules.
        Structure is documented below.
        """
        return pulumi.get(self, "all_updates_rule")

    @property
    @pulumi.getter
    def amount(self) -> pulumi.Output['outputs.BudgetAmount']:
        """
        The budgeted amount for each usage period.
        Structure is documented below.
        """
        return pulumi.get(self, "amount")

    @property
    @pulumi.getter(name="billingAccount")
    def billing_account(self) -> pulumi.Output[str]:
        """
        ID of the billing account to set a budget on.
        """
        return pulumi.get(self, "billing_account")

    @property
    @pulumi.getter(name="budgetFilter")
    def budget_filter(self) -> pulumi.Output['outputs.BudgetBudgetFilter']:
        """
        Filters that define which resources are used to compute the actual
        spend against the budget.
        Structure is documented below.
        """
        return pulumi.get(self, "budget_filter")

    @property
    @pulumi.getter(name="displayName")
    def display_name(self) -> pulumi.Output[Optional[str]]:
        """
        User data for display name in UI. Must be <= 60 chars.
        """
        return pulumi.get(self, "display_name")

    @property
    @pulumi.getter
    def name(self) -> pulumi.Output[str]:
        """
        Resource name of the budget. The resource name implies the scope of a budget. Values are of the form
        billingAccounts/{billingAccountId}/budgets/{budgetId}.
        """
        return pulumi.get(self, "name")

    @property
    @pulumi.getter(name="thresholdRules")
    def threshold_rules(self) -> pulumi.Output[Sequence['outputs.BudgetThresholdRule']]:
        """
        Rules that trigger alerts (notifications of thresholds being
        crossed) when spend exceeds the specified percentages of the
        budget.
        Structure is documented below.
        """
        return pulumi.get(self, "threshold_rules")
# ----------------------------------------------------------------------
# File: visualizations/ch5-results/gtex_barcharts.py
# Repo: arnegebert/splicing (MIT)
# ----------------------------------------------------------------------
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc
import numpy as np


def bar_charts_exons():
    plt.style.use('seaborn')
    with_extra_random_guessing_plot = False
    # can be further fine-tuned; either 0.5 from start or I include baseline model
    labels = ['DSC', 'D2V', 'RASC']
    brain_means = [0.664, 0.617, 0.645]
    cerebellum_means = [0.649, 0.610, 0.631]
    heart_means = [0.657, 0.604, 0.627]
    brain_stds = [0.011, 0.010, 0.022]
    cerebellum_stds = [0.008, 0.008, 0.029]
    heart_stds = [0.016, 0.009, 0.014]

    x = np.arange(len(brain_means))  # the label locations
    width = 0.2  # the width of the bars

    fig, ax = plt.subplots()
    rects1 = ax.bar(x - width, brain_means, width, yerr=brain_stds, label='Brain cortex')
    rects2 = ax.bar(x, cerebellum_means, width, yerr=cerebellum_stds, label='Cerebellum')
    rects3 = ax.bar(x + width, heart_means, width, yerr=heart_stds, label='Heart')

    # ax.set_title('AUC and inverse number of weights by model')
    ax.set_xticks(np.arange(len(labels)))
    ax.set_ylim(0.5, 1)
    ax.set_ylabel('AUC')
    ax.set_xticklabels(labels)
    ax.legend()

    def autolabel(rects, labels=None, stds=None):
        """Attach a text label above each bar in *rects*, displaying its height."""
        for i, rect in enumerate(rects):
            label = labels[i] if labels else rect.get_height()
            height = rect.get_height()
            if label == '1/20,000':
                continue
            y_label = height + 0.02 if not stds else height + stds[i] * 0.7 + 0.002
            ax.annotate(f'{label}',
                        xy=(rect.get_x() + rect.get_width() / 2, y_label),
                        xytext=(0, 3),  # 3 points vertical offset
                        textcoords="offset points",
                        ha='center', va='bottom')

    autolabel(rects1, stds=brain_stds)
    autolabel(rects2, stds=cerebellum_stds)
    autolabel(rects3, stds=heart_stds)

    if with_extra_random_guessing_plot:
        labels.append('Random guessing')
        ax.bar(x - width, 0.5, width, color='gray')
        ax.bar(x, 0.5, width, color='gray')
        ax.bar(x + width, 0.5, width, color='gray')
        baseline_x = max(x) + 1
        width2 = 0.3
        rect4 = ax.bar(baseline_x, 0.5, width2, color='grey')
        autolabel(rect4)
    ax.axhline(y=0.5, color='gray')

    fig.tight_layout()
    plt.savefig('gtex_exon_barcharts.png', dpi=300, bbox_inches='tight')
    plt.show()


def bar_charts_juncs():
    plt.style.use('seaborn')
    with_extra_random_guessing_plot = False
    # can be further fine-tuned; either 0.5 from start or I include baseline model
    labels = ['DSC', 'D2V', 'RASC']
    brain_means = [0.699, 0.671, 0.810]
    cerebellum_means = [0.704, 0.673, 0.808]
    heart_means = [0.699, 0.677, 0.807]
    brain_stds = [0.006, 0.003, 0.012]
    cerebellum_stds = [0.006, 0.004, 0.008]
    heart_stds = [0.008, 0.005, 0.013]

    x = np.arange(len(brain_means))  # the label locations
    width = 0.2  # the width of the bars

    fig, ax = plt.subplots()
    rects1 = ax.bar(x - width, brain_means, width, yerr=brain_stds, label='Brain cortex')
    rects2 = ax.bar(x, cerebellum_means, width, yerr=cerebellum_stds, label='Cerebellum')
    rects3 = ax.bar(x + width, heart_means, width, yerr=heart_stds, label='Heart')

    # ax.set_title('AUC and inverse number of weights by model')
    ax.set_xticks(np.arange(len(labels)))
    ax.set_ylim(0.5, 1)
    ax.set_ylabel('AUC')
    ax.set_xticklabels(labels)
    ax.legend()

    def autolabel(rects, labels=None, stds=None):
        """Attach a text label above each bar in *rects*, displaying its height."""
        for i, rect in enumerate(rects):
            label = labels[i] if labels else rect.get_height()
            height = rect.get_height()
            if label == '1/20,000':
                continue
            y_label = height + 0.02 if not stds else height + stds[i] * 0.7 + 0.002
            ax.annotate(f'{label}',
                        xy=(rect.get_x() + rect.get_width() / 2, y_label),
                        xytext=(0, 3),  # 3 points vertical offset
                        textcoords="offset points",
                        ha='center', va='bottom')

    autolabel(rects1, stds=brain_stds)
    autolabel(rects2, stds=cerebellum_stds)
    autolabel(rects3, stds=heart_stds)

    if with_extra_random_guessing_plot:
        labels.append('Random guessing')
        ax.bar(x - width, 0.5, width, color='gray')
        ax.bar(x, 0.5, width, color='gray')
        ax.bar(x + width, 0.5, width, color='gray')
        baseline_x = max(x) + 1
        width2 = 0.3
        rect4 = ax.bar(baseline_x, 0.5, width2, color='grey')
        autolabel(rect4)
    ax.axhline(y=0.5, color='gray')

    fig.tight_layout()
    plt.savefig('gtex_junc_barcharts.png', dpi=300, bbox_inches='tight')
    plt.show()


if __name__ == '__main__':
    bar_charts_exons()
    bar_charts_juncs()
# ----------------------------------------------------------------------
# File: ietf/group/migrations/0004_auto_20150430_0847.py
# Repo: ekr/ietfdb (BSD-3-Clause / BSD-3-Clause-No-Nuclear-License-2014)
# ----------------------------------------------------------------------
# -*- coding: utf-8 -*-
from __future__ import unicode_literals

from django.db import models, migrations


class Migration(migrations.Migration):

    dependencies = [
        ('group', '0003_auto_20150304_0743'),
    ]

    operations = [
        migrations.AlterField(
            model_name='group',
            name='unused_states',
            field=models.ManyToManyField(help_text=b'Document states that have been disabled for the group.', to='doc.State', blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='group',
            name='unused_tags',
            field=models.ManyToManyField(help_text=b'Document tags that have been disabled for the group.', to='name.DocTagName', blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='grouphistory',
            name='unused_states',
            field=models.ManyToManyField(help_text=b'Document states that have been disabled for the group.', to='doc.State', blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='grouphistory',
            name='unused_tags',
            field=models.ManyToManyField(help_text=b'Document tags that have been disabled for the group.', to='name.DocTagName', blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='groupmilestone',
            name='resolved',
            field=models.CharField(help_text=b'Explanation of why milestone is resolved (usually "Done"), or empty if still due.', max_length=50, blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='groupmilestonehistory',
            name='resolved',
            field=models.CharField(help_text=b'Explanation of why milestone is resolved (usually "Done"), or empty if still due.', max_length=50, blank=True),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='role',
            name='email',
            field=models.ForeignKey(help_text=b'Email address used by person for this role.', to='person.Email'),
            preserve_default=True,
        ),
        migrations.AlterField(
            model_name='rolehistory',
            name='email',
            field=models.ForeignKey(help_text=b'Email address used by person for this role.', to='person.Email'),
            preserve_default=True,
        ),
    ]
# ----------------------------------------------------------------------
# File: cisco-ios-xr/ydk/models/cisco_ios_xr/Cisco_IOS_XR_asic_errors_oper.py
# Repo: bopopescu/ACI (ECL-2.0 / Apache-2.0)
# ----------------------------------------------------------------------
""" Cisco_IOS_XR_asic_errors_oper
This module contains a collection of YANG definitions
for Cisco IOS\-XR asic\-errors package operational data.
This module contains definitions
for the following management objects\:
asic\-errors\: Error summary of all asics
Copyright (c) 2013\-2017 by Cisco Systems, Inc.
All rights reserved.
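A minimal read sketch for these bindings. This is not part of the generated
module; it assumes ydk-py is installed and a reachable NETCONF-enabled device,
and the address and credentials below are placeholders:

```python
from ydk.services import CRUDService
from ydk.providers import NetconfServiceProvider
from ydk.models.cisco_ios_xr import Cisco_IOS_XR_asic_errors_oper as asic_oper

# connect to the device (placeholder address and credentials)
provider = NetconfServiceProvider(address='192.0.2.1',
                                  username='admin', password='admin')
crud = CRUDService()

# read the whole asic-errors operational tree and walk the per-node list
asic_errors = crud.read(provider, asic_oper.AsicErrors())
for node in asic_errors.nodes.node:
    print(node.node_name)
```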
"""
from collections import OrderedDict
from ydk.types import Entity, EntityPath, Identity, Enum, YType, YLeaf, YLeafList, YList, LeafDataList, Bits, Empty, Decimal64
from ydk.filters import YFilter
from ydk.errors import YError, YModelError
from ydk.errors.error_handler import handle_type_error as _handle_type_error
class AsicErrors(Entity):
"""
Error summary of all asics
.. attribute:: nodes
Asic errors for each available nodes
**type**\: :py:class:`Nodes <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors, self).__init__()
self._top_entity = None
self.yang_name = "asic-errors"
self.yang_parent_name = "Cisco-IOS-XR-asic-errors-oper"
self.is_top_level_class = True
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_container_classes = OrderedDict([("nodes", ("nodes", AsicErrors.Nodes))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict()
self.nodes = AsicErrors.Nodes()
self.nodes.parent = self
self._children_name_map["nodes"] = "nodes"
self._children_yang_names.add("nodes")
self._segment_path = lambda: "Cisco-IOS-XR-asic-errors-oper:asic-errors"
class Nodes(Entity):
"""
Asic errors for each available nodes
.. attribute:: node
Asic error for a particular node
**type**\: list of :py:class:`Node <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes, self).__init__()
self.yang_name = "nodes"
self.yang_parent_name = "asic-errors"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("node", ("node", AsicErrors.Nodes.Node))])
self._leafs = OrderedDict()
self.node = YList(self)
self._segment_path = lambda: "nodes"
self._absolute_path = lambda: "Cisco-IOS-XR-asic-errors-oper:asic-errors/%s" % self._segment_path()
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes, [], name, value)
class Node(Entity):
"""
Asic error for a particular node
.. attribute:: node_name (key)
Node ID
**type**\: str
**pattern:** ([a\-zA\-Z0\-9\_]\*\\d+/){1,2}([a\-zA\-Z0\-9\_]\*\\d+)
.. attribute:: asic_information
Asic on the node
**type**\: list of :py:class:`AsicInformation <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node, self).__init__()
self.yang_name = "node"
self.yang_parent_name = "nodes"
self.is_top_level_class = False
self.has_list_ancestor = False
self.ylist_key_names = ['node_name']
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("asic-information", ("asic_information", AsicErrors.Nodes.Node.AsicInformation))])
self._leafs = OrderedDict([
('node_name', YLeaf(YType.str, 'node-name')),
])
self.node_name = None
self.asic_information = YList(self)
self._segment_path = lambda: "node" + "[node-name='" + str(self.node_name) + "']"
self._absolute_path = lambda: "Cisco-IOS-XR-asic-errors-oper:asic-errors/nodes/%s" % self._segment_path()
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node, ['node_name'], name, value)
class AsicInformation(Entity):
"""
Asic on the node
.. attribute:: asic (key)
Asic string
**type**\: str
**pattern:** [\\w\\\-\\.\:,\_@#%$\\+=\\\|;]+
.. attribute:: all_instances
All asic instance on the node
**type**\: :py:class:`AllInstances <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.AllInstances>`
.. attribute:: instances
All asic errors on the node
**type**\: :py:class:`Instances <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation, self).__init__()
self.yang_name = "asic-information"
self.yang_parent_name = "node"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['asic']
self._child_container_classes = OrderedDict([("all-instances", ("all_instances", AsicErrors.Nodes.Node.AsicInformation.AllInstances)), ("instances", ("instances", AsicErrors.Nodes.Node.AsicInformation.Instances))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('asic', YLeaf(YType.str, 'asic')),
])
self.asic = None
self.all_instances = AsicErrors.Nodes.Node.AsicInformation.AllInstances()
self.all_instances.parent = self
self._children_name_map["all_instances"] = "all-instances"
self._children_yang_names.add("all-instances")
self.instances = AsicErrors.Nodes.Node.AsicInformation.Instances()
self.instances.parent = self
self._children_name_map["instances"] = "instances"
self._children_yang_names.add("instances")
self._segment_path = lambda: "asic-information" + "[asic='" + str(self.asic) + "']"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation, ['asic'], name, value)
class AllInstances(Entity):
"""
All asic instance on the node
.. attribute:: all_error_path
Error path of all instances
**type**\: :py:class:`AllErrorPath <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.AllInstances, self).__init__()
self.yang_name = "all-instances"
self.yang_parent_name = "asic-information"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([("all-error-path", ("all_error_path", AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict()
self.all_error_path = AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath()
self.all_error_path.parent = self
self._children_name_map["all_error_path"] = "all-error-path"
self._children_yang_names.add("all-error-path")
self._segment_path = lambda: "all-instances"
class AllErrorPath(Entity):
"""
Error path of all instances
.. attribute:: summary
Summary of all instances errors
**type**\: :py:class:`Summary <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath, self).__init__()
self.yang_name = "all-error-path"
self.yang_parent_name = "all-instances"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([("summary", ("summary", AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict()
self.summary = AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary()
self.summary.parent = self
self._children_name_map["summary"] = "summary"
self._children_yang_names.add("summary")
self._segment_path = lambda: "all-error-path"
class Summary(Entity):
"""
Summary of all instances errors
.. attribute:: legacy_client
legacy client
**type**\: bool
.. attribute:: cih_client
cih client
**type**\: bool
.. attribute:: sum_data
sum data
**type**\: list of :py:class:`SumData <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary.SumData>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary, self).__init__()
self.yang_name = "summary"
self.yang_parent_name = "all-error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("sum-data", ("sum_data", AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary.SumData))])
self._leafs = OrderedDict([
('legacy_client', YLeaf(YType.boolean, 'legacy-client')),
('cih_client', YLeaf(YType.boolean, 'cih-client')),
])
self.legacy_client = None
self.cih_client = None
self.sum_data = YList(self)
self._segment_path = lambda: "summary"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary, ['legacy_client', 'cih_client'], name, value)
class SumData(Entity):
"""
sum data
.. attribute:: num_nodes
num nodes
**type**\: int
**range:** 0..4294967295
.. attribute:: crc_err_count
crc err count
**type**\: int
**range:** 0..4294967295
.. attribute:: sbe_err_count
sbe err count
**type**\: int
**range:** 0..4294967295
.. attribute:: mbe_err_count
mbe err count
**type**\: int
**range:** 0..4294967295
.. attribute:: par_err_count
par err count
**type**\: int
**range:** 0..4294967295
.. attribute:: gen_err_count
gen err count
**type**\: int
**range:** 0..4294967295
.. attribute:: reset_err_count
reset err count
**type**\: int
**range:** 0..4294967295
.. attribute:: err_count
err count
**type**\: list of int
**range:** 0..4294967295
.. attribute:: pcie_err_count
pcie err count
**type**\: list of int
**range:** 0..4294967295
.. attribute:: node_key
node key
**type**\: list of int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary.SumData, self).__init__()
self.yang_name = "sum-data"
self.yang_parent_name = "summary"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('num_nodes', YLeaf(YType.uint32, 'num-nodes')),
('crc_err_count', YLeaf(YType.uint32, 'crc-err-count')),
('sbe_err_count', YLeaf(YType.uint32, 'sbe-err-count')),
('mbe_err_count', YLeaf(YType.uint32, 'mbe-err-count')),
('par_err_count', YLeaf(YType.uint32, 'par-err-count')),
('gen_err_count', YLeaf(YType.uint32, 'gen-err-count')),
('reset_err_count', YLeaf(YType.uint32, 'reset-err-count')),
('err_count', YLeafList(YType.uint32, 'err-count')),
('pcie_err_count', YLeafList(YType.uint32, 'pcie-err-count')),
('node_key', YLeafList(YType.uint32, 'node-key')),
])
self.num_nodes = None
self.crc_err_count = None
self.sbe_err_count = None
self.mbe_err_count = None
self.par_err_count = None
self.gen_err_count = None
self.reset_err_count = None
self.err_count = []
self.pcie_err_count = []
self.node_key = []
self._segment_path = lambda: "sum-data"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.AllInstances.AllErrorPath.Summary.SumData, ['num_nodes', 'crc_err_count', 'sbe_err_count', 'mbe_err_count', 'par_err_count', 'gen_err_count', 'reset_err_count', 'err_count', 'pcie_err_count', 'node_key'], name, value)
class Instances(Entity):
"""
All asic errors on the node
.. attribute:: instance
Particular asic instance on the node
**type**\: list of :py:class:`Instance <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances, self).__init__()
self.yang_name = "instances"
self.yang_parent_name = "asic-information"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("instance", ("instance", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance))])
self._leafs = OrderedDict()
self.instance = YList(self)
self._segment_path = lambda: "instances"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances, [], name, value)
class Instance(Entity):
"""
Particular asic instance on the node
.. attribute:: asic_instance (key)
asic instance
**type**\: int
**range:** \-2147483648..2147483647
.. attribute:: error_path
Error path of the instances
**type**\: :py:class:`ErrorPath <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance, self).__init__()
self.yang_name = "instance"
self.yang_parent_name = "instances"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = ['asic_instance']
self._child_container_classes = OrderedDict([("error-path", ("error_path", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('asic_instance', YLeaf(YType.int32, 'asic-instance')),
])
self.asic_instance = None
self.error_path = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath()
self.error_path.parent = self
self._children_name_map["error_path"] = "error-path"
self._children_yang_names.add("error-path")
self._segment_path = lambda: "instance" + "[asic-instance='" + str(self.asic_instance) + "']"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance, ['asic_instance'], name, value)
class ErrorPath(Entity):
"""
Error path of the instances
.. attribute:: multiple_bit_soft_errors
Multiple bit soft error information
**type**\: :py:class:`MultipleBitSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors>`
.. attribute:: asic_error_generic_soft
Generic soft error information
**type**\: :py:class:`AsicErrorGenericSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft>`
.. attribute:: crc_hard_errors
CRC hard error information
**type**\: :py:class:`CrcHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors>`
.. attribute:: asic_error_sbe_soft
SBE soft error information
**type**\: :py:class:`AsicErrorSbeSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft>`
.. attribute:: hardware_soft_errors
Hardware soft error information
**type**\: :py:class:`HardwareSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors>`
.. attribute:: asic_error_crc_soft
CRC soft error information
**type**\: :py:class:`AsicErrorCrcSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft>`
.. attribute:: asic_error_parity_soft
Parity soft error information
**type**\: :py:class:`AsicErrorParitySoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft>`
.. attribute:: io_soft_errors
IO soft error information
**type**\: :py:class:`IoSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors>`
.. attribute:: reset_soft_errors
Reset soft error information
**type**\: :py:class:`ResetSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors>`
.. attribute:: barrier_hard_errors
Barrier hard error information
**type**\: :py:class:`BarrierHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors>`
.. attribute:: ucode_soft_errors
Ucode soft error information
**type**\: :py:class:`UcodeSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors>`
.. attribute:: asic_error_reset_hard
Reset hard error information
**type**\: :py:class:`AsicErrorResetHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard>`
.. attribute:: single_bit_hard_errors
Single bit hard error information
**type**\: :py:class:`SingleBitHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors>`
.. attribute:: indirect_hard_errors
Indirect hard error information
**type**\: :py:class:`IndirectHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors>`
.. attribute:: outof_resource_soft
Out-of-resource (OOR) threshold information
**type**\: :py:class:`OutofResourceSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft>`
.. attribute:: crc_soft_errors
CRC soft error information
**type**\: :py:class:`CrcSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors>`
.. attribute:: time_out_hard_errors
Time out hard error information
**type**\: :py:class:`TimeOutHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors>`
.. attribute:: barrier_soft_errors
Barrier soft error information
**type**\: :py:class:`BarrierSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors>`
.. attribute:: asic_error_mbe_soft
MBE soft error information
**type**\: :py:class:`AsicErrorMbeSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft>`
.. attribute:: back_pressure_hard_errors
Back pressure (BP) hard error information
**type**\: :py:class:`BackPressureHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors>`
.. attribute:: single_bit_soft_errors
Single bit soft error information
**type**\: :py:class:`SingleBitSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors>`
.. attribute:: indirect_soft_errors
Indirect soft error information
**type**\: :py:class:`IndirectSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors>`
.. attribute:: generic_hard_errors
Generic hard error information
**type**\: :py:class:`GenericHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors>`
.. attribute:: link_hard_errors
Link hard error information
**type**\: :py:class:`LinkHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors>`
.. attribute:: configuration_hard_errors
Configuration hard error information
**type**\: :py:class:`ConfigurationHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors>`
.. attribute:: instance_summary
Summary for a specific instance
**type**\: :py:class:`InstanceSummary <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary>`
.. attribute:: unexpected_hard_errors
Unexpected hard error information
**type**\: :py:class:`UnexpectedHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors>`
.. attribute:: time_out_soft_errors
Time out soft error information
**type**\: :py:class:`TimeOutSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors>`
.. attribute:: asic_error_generic_hard
Generic hard error information
**type**\: :py:class:`AsicErrorGenericHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard>`
.. attribute:: parity_hard_errors
Parity hard error information
**type**\: :py:class:`ParityHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors>`
.. attribute:: descriptor_hard_errors
Descriptor hard error information
**type**\: :py:class:`DescriptorHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors>`
.. attribute:: interface_hard_errors
Interface hard error information
**type**\: :py:class:`InterfaceHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors>`
.. attribute:: asic_error_sbe_hard
SBE hard error information
**type**\: :py:class:`AsicErrorSbeHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard>`
.. attribute:: asic_error_crc_hard
CRC hard error information
**type**\: :py:class:`AsicErrorCrcHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard>`
.. attribute:: asic_error_parity_hard
Parity hard error information
**type**\: :py:class:`AsicErrorParityHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard>`
.. attribute:: asic_error_reset_soft
Reset soft error information
**type**\: :py:class:`AsicErrorResetSoft <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft>`
.. attribute:: back_pressure_soft_errors
Back pressure (BP) soft error information
**type**\: :py:class:`BackPressureSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors>`
.. attribute:: generic_soft_errors
Generic soft error information
**type**\: :py:class:`GenericSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors>`
.. attribute:: link_soft_errors
Link soft error information
**type**\: :py:class:`LinkSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors>`
.. attribute:: configuration_soft_errors
Configuration soft error information
**type**\: :py:class:`ConfigurationSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors>`
.. attribute:: multiple_bit_hard_errors
Multiple bit hard error information
**type**\: :py:class:`MultipleBitHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors>`
.. attribute:: unexpected_soft_errors
Unexpected soft error information
**type**\: :py:class:`UnexpectedSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors>`
.. attribute:: outof_resource_hard
Out-of-resource (OOR) threshold information
**type**\: :py:class:`OutofResourceHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard>`
.. attribute:: hardware_hard_errors
Hardware hard error information
**type**\: :py:class:`HardwareHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors>`
.. attribute:: parity_soft_errors
Parity soft error information
**type**\: :py:class:`ParitySoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors>`
.. attribute:: descriptor_soft_errors
Descriptor soft error information
**type**\: :py:class:`DescriptorSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors>`
.. attribute:: interface_soft_errors
Interface soft error information
**type**\: :py:class:`InterfaceSoftErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors>`
.. attribute:: io_hard_errors
IO hard error information
**type**\: :py:class:`IoHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors>`
.. attribute:: reset_hard_errors
Reset hard error information
**type**\: :py:class:`ResetHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors>`
.. attribute:: ucode_hard_errors
Ucode hard error information
**type**\: :py:class:`UcodeHardErrors <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors>`
.. attribute:: asic_error_mbe_hard
MBE hard error information
**type**\: :py:class:`AsicErrorMbeHard <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard>`
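Every container attribute above follows one naming rule: the Python attribute is the YANG child name with hyphens replaced by underscores. A standalone sketch of the reverse mapping used when populating ``_children_name_map`` below (the helper name is illustrative, not part of the generated API):

```python
def to_yang_name(attr):
    """Map a generated Python attribute name back to its YANG child
    name, e.g. "multiple_bit_soft_errors" -> "multiple-bit-soft-errors"."""
    return attr.replace("_", "-")
```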
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath, self).__init__()
self.yang_name = "error-path"
self.yang_parent_name = "instance"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([("multiple-bit-soft-errors", ("multiple_bit_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors)), ("asic-error-generic-soft", ("asic_error_generic_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft)), ("crc-hard-errors", ("crc_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors)), ("asic-error-sbe-soft", ("asic_error_sbe_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft)), ("hardware-soft-errors", ("hardware_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors)), ("asic-error-crc-soft", ("asic_error_crc_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft)), ("asic-error-parity-soft", ("asic_error_parity_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft)), ("io-soft-errors", ("io_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors)), ("reset-soft-errors", ("reset_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors)), ("barrier-hard-errors", ("barrier_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors)), ("ucode-soft-errors", ("ucode_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors)), ("asic-error-reset-hard", ("asic_error_reset_hard", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard)), ("single-bit-hard-errors", ("single_bit_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors)), ("indirect-hard-errors", ("indirect_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors)), ("outof-resource-soft", 
("outof_resource_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft)), ("crc-soft-errors", ("crc_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors)), ("time-out-hard-errors", ("time_out_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors)), ("barrier-soft-errors", ("barrier_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors)), ("asic-error-mbe-soft", ("asic_error_mbe_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft)), ("back-pressure-hard-errors", ("back_pressure_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors)), ("single-bit-soft-errors", ("single_bit_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors)), ("indirect-soft-errors", ("indirect_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors)), ("generic-hard-errors", ("generic_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors)), ("link-hard-errors", ("link_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors)), ("configuration-hard-errors", ("configuration_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors)), ("instance-summary", ("instance_summary", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary)), ("unexpected-hard-errors", ("unexpected_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors)), ("time-out-soft-errors", ("time_out_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors)), ("asic-error-generic-hard", ("asic_error_generic_hard", 
AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard)), ("parity-hard-errors", ("parity_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors)), ("descriptor-hard-errors", ("descriptor_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors)), ("interface-hard-errors", ("interface_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors)), ("asic-error-sbe-hard", ("asic_error_sbe_hard", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard)), ("asic-error-crc-hard", ("asic_error_crc_hard", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard)), ("asic-error-parity-hard", ("asic_error_parity_hard", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard)), ("asic-error-reset-soft", ("asic_error_reset_soft", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft)), ("back-pressure-soft-errors", ("back_pressure_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors)), ("generic-soft-errors", ("generic_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors)), ("link-soft-errors", ("link_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors)), ("configuration-soft-errors", ("configuration_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors)), ("multiple-bit-hard-errors", ("multiple_bit_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors)), ("unexpected-soft-errors", ("unexpected_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors)), ("outof-resource-hard", ("outof_resource_hard", 
AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard)), ("hardware-hard-errors", ("hardware_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors)), ("parity-soft-errors", ("parity_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors)), ("descriptor-soft-errors", ("descriptor_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors)), ("interface-soft-errors", ("interface_soft_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors)), ("io-hard-errors", ("io_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors)), ("reset-hard-errors", ("reset_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors)), ("ucode-hard-errors", ("ucode_hard_errors", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors)), ("asic-error-mbe-hard", ("asic_error_mbe_hard", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard))])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict()
self.multiple_bit_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors()
self.multiple_bit_soft_errors.parent = self
self._children_name_map["multiple_bit_soft_errors"] = "multiple-bit-soft-errors"
self._children_yang_names.add("multiple-bit-soft-errors")
self.asic_error_generic_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft()
self.asic_error_generic_soft.parent = self
self._children_name_map["asic_error_generic_soft"] = "asic-error-generic-soft"
self._children_yang_names.add("asic-error-generic-soft")
self.crc_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors()
self.crc_hard_errors.parent = self
self._children_name_map["crc_hard_errors"] = "crc-hard-errors"
self._children_yang_names.add("crc-hard-errors")
self.asic_error_sbe_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft()
self.asic_error_sbe_soft.parent = self
self._children_name_map["asic_error_sbe_soft"] = "asic-error-sbe-soft"
self._children_yang_names.add("asic-error-sbe-soft")
self.hardware_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors()
self.hardware_soft_errors.parent = self
self._children_name_map["hardware_soft_errors"] = "hardware-soft-errors"
self._children_yang_names.add("hardware-soft-errors")
self.asic_error_crc_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft()
self.asic_error_crc_soft.parent = self
self._children_name_map["asic_error_crc_soft"] = "asic-error-crc-soft"
self._children_yang_names.add("asic-error-crc-soft")
self.asic_error_parity_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft()
self.asic_error_parity_soft.parent = self
self._children_name_map["asic_error_parity_soft"] = "asic-error-parity-soft"
self._children_yang_names.add("asic-error-parity-soft")
self.io_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors()
self.io_soft_errors.parent = self
self._children_name_map["io_soft_errors"] = "io-soft-errors"
self._children_yang_names.add("io-soft-errors")
self.reset_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors()
self.reset_soft_errors.parent = self
self._children_name_map["reset_soft_errors"] = "reset-soft-errors"
self._children_yang_names.add("reset-soft-errors")
self.barrier_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors()
self.barrier_hard_errors.parent = self
self._children_name_map["barrier_hard_errors"] = "barrier-hard-errors"
self._children_yang_names.add("barrier-hard-errors")
self.ucode_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors()
self.ucode_soft_errors.parent = self
self._children_name_map["ucode_soft_errors"] = "ucode-soft-errors"
self._children_yang_names.add("ucode-soft-errors")
self.asic_error_reset_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard()
self.asic_error_reset_hard.parent = self
self._children_name_map["asic_error_reset_hard"] = "asic-error-reset-hard"
self._children_yang_names.add("asic-error-reset-hard")
self.single_bit_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors()
self.single_bit_hard_errors.parent = self
self._children_name_map["single_bit_hard_errors"] = "single-bit-hard-errors"
self._children_yang_names.add("single-bit-hard-errors")
self.indirect_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors()
self.indirect_hard_errors.parent = self
self._children_name_map["indirect_hard_errors"] = "indirect-hard-errors"
self._children_yang_names.add("indirect-hard-errors")
self.outof_resource_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft()
self.outof_resource_soft.parent = self
self._children_name_map["outof_resource_soft"] = "outof-resource-soft"
self._children_yang_names.add("outof-resource-soft")
self.crc_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors()
self.crc_soft_errors.parent = self
self._children_name_map["crc_soft_errors"] = "crc-soft-errors"
self._children_yang_names.add("crc-soft-errors")
self.time_out_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors()
self.time_out_hard_errors.parent = self
self._children_name_map["time_out_hard_errors"] = "time-out-hard-errors"
self._children_yang_names.add("time-out-hard-errors")
self.barrier_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors()
self.barrier_soft_errors.parent = self
self._children_name_map["barrier_soft_errors"] = "barrier-soft-errors"
self._children_yang_names.add("barrier-soft-errors")
self.asic_error_mbe_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft()
self.asic_error_mbe_soft.parent = self
self._children_name_map["asic_error_mbe_soft"] = "asic-error-mbe-soft"
self._children_yang_names.add("asic-error-mbe-soft")
self.back_pressure_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors()
self.back_pressure_hard_errors.parent = self
self._children_name_map["back_pressure_hard_errors"] = "back-pressure-hard-errors"
self._children_yang_names.add("back-pressure-hard-errors")
self.single_bit_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors()
self.single_bit_soft_errors.parent = self
self._children_name_map["single_bit_soft_errors"] = "single-bit-soft-errors"
self._children_yang_names.add("single-bit-soft-errors")
self.indirect_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors()
self.indirect_soft_errors.parent = self
self._children_name_map["indirect_soft_errors"] = "indirect-soft-errors"
self._children_yang_names.add("indirect-soft-errors")
self.generic_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors()
self.generic_hard_errors.parent = self
self._children_name_map["generic_hard_errors"] = "generic-hard-errors"
self._children_yang_names.add("generic-hard-errors")
self.link_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors()
self.link_hard_errors.parent = self
self._children_name_map["link_hard_errors"] = "link-hard-errors"
self._children_yang_names.add("link-hard-errors")
self.configuration_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors()
self.configuration_hard_errors.parent = self
self._children_name_map["configuration_hard_errors"] = "configuration-hard-errors"
self._children_yang_names.add("configuration-hard-errors")
self.instance_summary = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary()
self.instance_summary.parent = self
self._children_name_map["instance_summary"] = "instance-summary"
self._children_yang_names.add("instance-summary")
self.unexpected_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors()
self.unexpected_hard_errors.parent = self
self._children_name_map["unexpected_hard_errors"] = "unexpected-hard-errors"
self._children_yang_names.add("unexpected-hard-errors")
self.time_out_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors()
self.time_out_soft_errors.parent = self
self._children_name_map["time_out_soft_errors"] = "time-out-soft-errors"
self._children_yang_names.add("time-out-soft-errors")
self.asic_error_generic_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard()
self.asic_error_generic_hard.parent = self
self._children_name_map["asic_error_generic_hard"] = "asic-error-generic-hard"
self._children_yang_names.add("asic-error-generic-hard")
self.parity_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors()
self.parity_hard_errors.parent = self
self._children_name_map["parity_hard_errors"] = "parity-hard-errors"
self._children_yang_names.add("parity-hard-errors")
self.descriptor_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors()
self.descriptor_hard_errors.parent = self
self._children_name_map["descriptor_hard_errors"] = "descriptor-hard-errors"
self._children_yang_names.add("descriptor-hard-errors")
self.interface_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors()
self.interface_hard_errors.parent = self
self._children_name_map["interface_hard_errors"] = "interface-hard-errors"
self._children_yang_names.add("interface-hard-errors")
self.asic_error_sbe_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard()
self.asic_error_sbe_hard.parent = self
self._children_name_map["asic_error_sbe_hard"] = "asic-error-sbe-hard"
self._children_yang_names.add("asic-error-sbe-hard")
self.asic_error_crc_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard()
self.asic_error_crc_hard.parent = self
self._children_name_map["asic_error_crc_hard"] = "asic-error-crc-hard"
self._children_yang_names.add("asic-error-crc-hard")
self.asic_error_parity_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard()
self.asic_error_parity_hard.parent = self
self._children_name_map["asic_error_parity_hard"] = "asic-error-parity-hard"
self._children_yang_names.add("asic-error-parity-hard")
self.asic_error_reset_soft = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft()
self.asic_error_reset_soft.parent = self
self._children_name_map["asic_error_reset_soft"] = "asic-error-reset-soft"
self._children_yang_names.add("asic-error-reset-soft")
self.back_pressure_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors()
self.back_pressure_soft_errors.parent = self
self._children_name_map["back_pressure_soft_errors"] = "back-pressure-soft-errors"
self._children_yang_names.add("back-pressure-soft-errors")
self.generic_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors()
self.generic_soft_errors.parent = self
self._children_name_map["generic_soft_errors"] = "generic-soft-errors"
self._children_yang_names.add("generic-soft-errors")
self.link_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors()
self.link_soft_errors.parent = self
self._children_name_map["link_soft_errors"] = "link-soft-errors"
self._children_yang_names.add("link-soft-errors")
self.configuration_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors()
self.configuration_soft_errors.parent = self
self._children_name_map["configuration_soft_errors"] = "configuration-soft-errors"
self._children_yang_names.add("configuration-soft-errors")
self.multiple_bit_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors()
self.multiple_bit_hard_errors.parent = self
self._children_name_map["multiple_bit_hard_errors"] = "multiple-bit-hard-errors"
self._children_yang_names.add("multiple-bit-hard-errors")
self.unexpected_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors()
self.unexpected_soft_errors.parent = self
self._children_name_map["unexpected_soft_errors"] = "unexpected-soft-errors"
self._children_yang_names.add("unexpected-soft-errors")
self.outof_resource_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard()
self.outof_resource_hard.parent = self
self._children_name_map["outof_resource_hard"] = "outof-resource-hard"
self._children_yang_names.add("outof-resource-hard")
self.hardware_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors()
self.hardware_hard_errors.parent = self
self._children_name_map["hardware_hard_errors"] = "hardware-hard-errors"
self._children_yang_names.add("hardware-hard-errors")
self.parity_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors()
self.parity_soft_errors.parent = self
self._children_name_map["parity_soft_errors"] = "parity-soft-errors"
self._children_yang_names.add("parity-soft-errors")
self.descriptor_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors()
self.descriptor_soft_errors.parent = self
self._children_name_map["descriptor_soft_errors"] = "descriptor-soft-errors"
self._children_yang_names.add("descriptor-soft-errors")
self.interface_soft_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors()
self.interface_soft_errors.parent = self
self._children_name_map["interface_soft_errors"] = "interface-soft-errors"
self._children_yang_names.add("interface-soft-errors")
self.io_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors()
self.io_hard_errors.parent = self
self._children_name_map["io_hard_errors"] = "io-hard-errors"
self._children_yang_names.add("io-hard-errors")
self.reset_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors()
self.reset_hard_errors.parent = self
self._children_name_map["reset_hard_errors"] = "reset-hard-errors"
self._children_yang_names.add("reset-hard-errors")
self.ucode_hard_errors = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors()
self.ucode_hard_errors.parent = self
self._children_name_map["ucode_hard_errors"] = "ucode-hard-errors"
self._children_yang_names.add("ucode-hard-errors")
self.asic_error_mbe_hard = AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard()
self.asic_error_mbe_hard.parent = self
self._children_name_map["asic_error_mbe_hard"] = "asic-error-mbe-hard"
self._children_yang_names.add("asic-error-mbe-hard")
self._segment_path = lambda: "error-path"
class MultipleBitSoftErrors(Entity):
"""
Multiple bit soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors, self).__init__()
self.yang_name = "multiple-bit-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "multiple-bit-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "multiple-bit-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
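The `Error` entries above expose an accumulated `count` leaf (uint32; leaves the device did not report read back as `None`). A minimal sketch of how a caller might total soft-error counts across such a list — plain dicts stand in for the generated `Error` entities so the sketch is self-contained, and `total_error_count` is a hypothetical helper, not part of the generated bindings:

```python
# Hedged sketch: sum the 'count' leaf over a collection of error entries.
# Plain dicts stand in for the generated Error entities; in real use these
# would be the items of the YList at multiple_bit_soft_errors.error.

def total_error_count(errors):
    """Total accumulated error count, treating unset (None) counts as zero."""
    return sum(entry.get("count") or 0 for entry in errors)

sample = [
    {"name": "mem0", "count": 3},
    {"name": "mem1", "count": None},  # leaf not reported by the device
    {"name": "mem2", "count": 7},
]
print(total_error_count(sample))  # 10
```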
class AsicErrorGenericSoft(Entity):
"""
Generic soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft, self).__init__()
self.yang_name = "asic-error-generic-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-generic-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-generic-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class CrcHardErrors(Entity):
"""
CRC hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors, self).__init__()
self.yang_name = "crc-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "crc-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "crc-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorSbeSoft(Entity):
"""
Single bit soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft, self).__init__()
self.yang_name = "asic-error-sbe-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-sbe-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-sbe-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
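Each `last_err` record carries a split seconds/nanoseconds timestamp pair (`at_time`, `at_time_nsec`). A sketch of selecting the most recent printable error by comparing that pair lexicographically — again with plain dicts standing in for the generated `LastErr` entities, and `most_recent_error` a hypothetical helper rather than part of the bindings:

```python
# Hedged sketch: pick the newest LastErr-style record by its
# (at-time seconds, at-time nanoseconds) pair. Dicts stand in for the
# generated LastErr entities held in a last_err YList.

def most_recent_error(last_errs):
    """Return the record with the greatest (at_time, at_time_nsec), or None."""
    return max(
        last_errs,
        key=lambda rec: (rec["at_time"], rec["at_time_nsec"]),
        default=None,
    )

history = [
    {"at_time": 100, "at_time_nsec": 500, "error_desc": "parity hit"},
    {"at_time": 100, "at_time_nsec": 900, "error_desc": "parity hit again"},
    {"at_time": 99,  "at_time_nsec": 999, "error_desc": "older event"},
]
print(most_recent_error(history)["error_desc"])  # parity hit again
```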
class HardwareSoftErrors(Entity):
"""
Hardware soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors, self).__init__()
self.yang_name = "hardware-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "hardware-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "hardware-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
Register name
**type**\: str
.. attribute:: address
Register address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
Register width (bits)
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
Error timestamp (seconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
Error timestamp (nanoseconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
Error counter value
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
Error description
**type**\: str
.. attribute:: error_regval
Error register values
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
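# --- Usage sketch (illustrative, not part of the generated bindings) ---
# One way to read this operational data is YDK's CRUDService over NETCONF.
# The device address, credentials, and the exact attribute path below are
# assumptions following YDK naming conventions for the nested classes above.
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#     from ydk.models.cisco_ios_xr import Cisco_IOS_XR_asic_errors_oper as asic_oper
#
#     provider = NetconfServiceProvider(address='192.0.2.1',
#                                       username='admin', password='admin')
#     crud = CRUDService()
#     asic_errors = crud.read(provider, asic_oper.AsicErrors())
#     for node in asic_errors.nodes.node:
#         for inst in node.asic_information.instances.instance:
#             for err in inst.error_path.hardware_soft_errors.error:
#                 print(err.name, err.count, err.last_cleared)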
class AsicErrorCrcSoft(Entity):
"""
CRC soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft, self).__init__()
self.yang_name = "asic-error-crc-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-crc-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-crc-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
Register name
**type**\: str
.. attribute:: address
Register address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
Register width (bits)
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
Error timestamp (seconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
Error timestamp (nanoseconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
Error counter value
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
Error description
**type**\: str
.. attribute:: error_regval
Error register values
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorParitySoft(Entity):
"""
Parity soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft, self).__init__()
self.yang_name = "asic-error-parity-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-parity-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-parity-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
Register name
**type**\: str
.. attribute:: address
Register address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
Register width (bits)
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
Error timestamp (seconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
Error timestamp (nanoseconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
Error counter value
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
Error description
**type**\: str
.. attribute:: error_regval
Error register values
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParitySoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class IoSoftErrors(Entity):
"""
IO soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors, self).__init__()
self.yang_name = "io-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "io-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "io-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
Register name
**type**\: str
.. attribute:: address
Register address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
Register width (bits)
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
Error timestamp (seconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
Error timestamp (nanoseconds)
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
Error counter value
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
Error description
**type**\: str
.. attribute:: error_regval
Error register values
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ResetSoftErrors(Entity):
"""
Reset soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors, self).__init__()
self.yang_name = "reset-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "reset-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "reset-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
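# Usage sketch (illustrative comment, not part of the generated model):
# these classes describe read-only operational data, so they are normally
# populated by a read rather than constructed by hand. Assuming a ydk-py
# CRUD read over NETCONF (the device address and credentials below are
# hypothetical):
#
#     from ydk.services import CRUDService
#     from ydk.providers import NetconfServiceProvider
#     from ydk.models.cisco_ios_xr import Cisco_IOS_XR_asic_errors_oper as oper
#
#     provider = NetconfServiceProvider(address='192.0.2.1',
#                                       username='admin', password='admin')
#     asic_errors = CRUDService().read(provider, oper.AsicErrors())
#     for node in asic_errors.nodes.node:
#         ...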
class BarrierHardErrors(Entity):
"""
Barrier hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors, self).__init__()
self.yang_name = "barrier-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "barrier-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32\-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "barrier-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
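# Rendering sketch (comment only): error_regval is modeled as a list of
# uint8 values (YLeafList of YType.uint8), so a captured register value
# from a populated LastErr instance `last_err` could be shown as a hex
# string:
#
#     regval_hex = ' '.join('%02x' % b for b in last_err.error_regval)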
class UcodeSoftErrors(Entity):
"""
Ucode soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors, self).__init__()
self.yang_name = "ucode-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "ucode-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32\-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "ucode-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorResetHard(Entity):
"""
Asic error reset hard information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard, self).__init__()
self.yang_name = "asic-error-reset-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-reset-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32\-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.CsrsInfo>`
.. attribute:: last_err
Last printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-reset-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class SingleBitHardErrors(Entity):
"""
Single bit hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors, self).__init__()
self.yang_name = "single-bit-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "single-bit-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32\-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "single-bit-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class IndirectHardErrors(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors, self).__init__()
self.yang_name = "indirect-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "indirect-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "indirect-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class OutofResourceSoft(Entity):
"""
OOR thresh information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft, self).__init__()
self.yang_name = "outof-resource-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "outof-resource-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "outof-resource-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class CrcSoftErrors(Entity):
"""
CRC soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors, self).__init__()
self.yang_name = "crc-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "crc-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "crc-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.CrcSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class TimeOutHardErrors(Entity):
"""
Time out hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors, self).__init__()
self.yang_name = "time-out-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "time-out-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "time-out-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class BarrierSoftErrors(Entity):
"""
Barrier soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors, self).__init__()
self.yang_name = "barrier-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "barrier-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "barrier-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BarrierSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorMbeSoft(Entity):
"""
Multi\-bit soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft, self).__init__()
self.yang_name = "asic-error-mbe-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-mbe-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to memory
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-mbe-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class BackPressureHardErrors(Entity):
"""
Back\-pressure hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors, self).__init__()
self.yang_name = "back-pressure-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "back-pressure-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to memory
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "back-pressure-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class SingleBitSoftErrors(Entity):
"""
Single bit soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors, self).__init__()
self.yang_name = "single-bit-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "single-bit-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to memory
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "single-bit-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.SingleBitSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class IndirectSoftErrors(Entity):
"""
Indirect soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors, self).__init__()
self.yang_name = "indirect-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "indirect-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to memory
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "indirect-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IndirectSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class GenericHardErrors(Entity):
"""
Generic hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors, self).__init__()
self.yang_name = "generic-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "generic-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "generic-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class LinkHardErrors(Entity):
"""
Link hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors, self).__init__()
self.yang_name = "link-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "link-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "link-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ConfigurationHardErrors(Entity):
"""
Configuration hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors, self).__init__()
self.yang_name = "configuration-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "configuration-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "configuration-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class InstanceSummary(Entity):
"""
Summary for a specific instance
.. attribute:: legacy_client
legacy client
**type**\: bool
.. attribute:: cih_client
cih client
**type**\: bool
.. attribute:: sum_data
sum data
**type**\: list of :py:class:`SumData <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary.SumData>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary, self).__init__()
self.yang_name = "instance-summary"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("sum-data", ("sum_data", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary.SumData))])
self._leafs = OrderedDict([
('legacy_client', YLeaf(YType.boolean, 'legacy-client')),
('cih_client', YLeaf(YType.boolean, 'cih-client')),
])
self.legacy_client = None
self.cih_client = None
self.sum_data = YList(self)
self._segment_path = lambda: "instance-summary"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary, ['legacy_client', 'cih_client'], name, value)
class SumData(Entity):
"""
sum data
.. attribute:: num_nodes
num nodes
**type**\: int
**range:** 0..4294967295
.. attribute:: crc_err_count
crc err count
**type**\: int
**range:** 0..4294967295
.. attribute:: sbe_err_count
sbe err count
**type**\: int
**range:** 0..4294967295
.. attribute:: mbe_err_count
mbe err count
**type**\: int
**range:** 0..4294967295
.. attribute:: par_err_count
par err count
**type**\: int
**range:** 0..4294967295
.. attribute:: gen_err_count
gen err count
**type**\: int
**range:** 0..4294967295
.. attribute:: reset_err_count
reset err count
**type**\: int
**range:** 0..4294967295
.. attribute:: err_count
err count
**type**\: list of int
**range:** 0..4294967295
.. attribute:: pcie_err_count
pcie err count
**type**\: list of int
**range:** 0..4294967295
.. attribute:: node_key
node key
**type**\: list of int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary.SumData, self).__init__()
self.yang_name = "sum-data"
self.yang_parent_name = "instance-summary"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('num_nodes', YLeaf(YType.uint32, 'num-nodes')),
('crc_err_count', YLeaf(YType.uint32, 'crc-err-count')),
('sbe_err_count', YLeaf(YType.uint32, 'sbe-err-count')),
('mbe_err_count', YLeaf(YType.uint32, 'mbe-err-count')),
('par_err_count', YLeaf(YType.uint32, 'par-err-count')),
('gen_err_count', YLeaf(YType.uint32, 'gen-err-count')),
('reset_err_count', YLeaf(YType.uint32, 'reset-err-count')),
('err_count', YLeafList(YType.uint32, 'err-count')),
('pcie_err_count', YLeafList(YType.uint32, 'pcie-err-count')),
('node_key', YLeafList(YType.uint32, 'node-key')),
])
self.num_nodes = None
self.crc_err_count = None
self.sbe_err_count = None
self.mbe_err_count = None
self.par_err_count = None
self.gen_err_count = None
self.reset_err_count = None
self.err_count = []
self.pcie_err_count = []
self.node_key = []
self._segment_path = lambda: "sum-data"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InstanceSummary.SumData, ['num_nodes', 'crc_err_count', 'sbe_err_count', 'mbe_err_count', 'par_err_count', 'gen_err_count', 'reset_err_count', 'err_count', 'pcie_err_count', 'node_key'], name, value)
class UnexpectedHardErrors(Entity):
"""
Unexpected hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors, self).__init__()
self.yang_name = "unexpected-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "unexpected-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "unexpected-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class TimeOutSoftErrors(Entity):
"""
Time out soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors, self).__init__()
self.yang_name = "time-out-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "time-out-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "time-out-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.TimeOutSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorGenericHard(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard, self).__init__()
self.yang_name = "asic-error-generic-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-generic-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-generic-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorGenericHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ParityHardErrors(Entity):
"""
Parity hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors, self).__init__()
self.yang_name = "parity-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "parity-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "parity-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParityHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class DescriptorHardErrors(Entity):
"""
Descriptor hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors, self).__init__()
self.yang_name = "descriptor-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "descriptor-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "descriptor-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class InterfaceHardErrors(Entity):
"""
Interface hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors, self).__init__()
self.yang_name = "interface-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "interface-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "interface-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorSbeHard(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard, self).__init__()
self.yang_name = "asic-error-sbe-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-sbe-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-sbe-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorSbeHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorCrcHard(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard, self).__init__()
self.yang_name = "asic-error-crc-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-crc-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-crc-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorCrcHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorParityHard(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard, self).__init__()
self.yang_name = "asic-error-parity-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-parity-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-parity-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorParityHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorResetSoft(Entity):
"""
Indirect soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft, self).__init__()
self.yang_name = "asic-error-reset-soft"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-reset-soft"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32-bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-reset-soft"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorResetSoft.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class BackPressureSoftErrors(Entity):
"""
BP soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors, self).__init__()
self.yang_name = "back-pressure-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "back-pressure-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "back-pressure-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.BackPressureSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class GenericSoftErrors(Entity):
"""
Generic soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors, self).__init__()
self.yang_name = "generic-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "generic-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "generic-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.GenericSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class LinkSoftErrors(Entity):
"""
Link soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors, self).__init__()
self.yang_name = "link-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "link-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "link-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.LinkSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ConfigurationSoftErrors(Entity):
"""
Configuration soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors, self).__init__()
self.yang_name = "configuration-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "configuration-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "configuration-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ConfigurationSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class MultipleBitHardErrors(Entity):
"""
Multiple bit hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors, self).__init__()
self.yang_name = "multiple-bit-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "multiple-bit-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "multiple-bit-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.MultipleBitHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
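The `_leafs` tables above pair each Python attribute name with its YANG leaf name; the two differ only in underscore-versus-hyphen spelling. A minimal stdlib sketch of that mechanical mapping (the `yang_leaf_name` helper is illustrative, not part of ydk):

```python
from collections import OrderedDict

def yang_leaf_name(attr_name):
    """Map a Python attribute name to its YANG leaf name (underscores become hyphens)."""
    return attr_name.replace('_', '-')

# Rebuild the name half of the _leafs table for the last-err list entry.
leaf_names = OrderedDict(
    (attr, yang_leaf_name(attr))
    for attr in ('at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval')
)
print(leaf_names['at_time_nsec'])  # at-time-nsec
```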
class UnexpectedSoftErrors(Entity):
"""
Unexpected soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors, self).__init__()
self.yang_name = "unexpected-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "unexpected-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "unexpected-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UnexpectedSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
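Each generated class routes attribute assignment through `_perform_setattr` with an explicit whitelist of leaf names, so a typo in a leaf name fails loudly instead of silently creating a new attribute. A simplified sketch of that pattern (illustrative only; the real ydk `Entity` also performs type validation and change tracking):

```python
class WhitelistSetattr(object):
    """Toy stand-in for the _perform_setattr whitelist check."""
    _leaf_names = ('name', 'address', 'width')

    def __setattr__(self, name, value):
        # Internal/private attributes bypass the leaf whitelist.
        if name.startswith('_') or name in self._leaf_names:
            object.__setattr__(self, name, value)
        else:
            raise AttributeError("'%s' is not a leaf of this entity" % name)

entity = WhitelistSetattr()
entity.width = 32          # accepted: a known leaf
try:
    entity.widht = 32      # rejected: the typo is caught at assignment time
except AttributeError as exc:
    print(exc)
```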
class OutofResourceHard(Entity):
"""
Out of resource (OOR) threshold information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard, self).__init__()
self.yang_name = "outof-resource-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "outof-resource-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "outof-resource-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.OutofResourceHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
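Every class defines `_segment_path` as a lambda returning only its own path segment; the full data path is formed by joining the ancestors' segments in nesting order. A sketch of that composition (the `absolute_path` helper is hypothetical, and the real path also carries list-key predicates, e.g. the node name, which are omitted here):

```python
def absolute_path(segments):
    """Join per-entity _segment_path results into one YANG data path."""
    return "/".join(segments)

# Segments as the nested classes above would report them, root first.
path = absolute_path([
    "Cisco-IOS-XR-asic-errors-oper:asic-errors",
    "nodes", "node", "asic-information", "instances", "instance",
    "error-path", "outof-resource-hard", "error", "last-err",
])
print(path)
```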
class HardwareHardErrors(Entity):
"""
Hardware hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors, self).__init__()
self.yang_name = "hardware-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "hardware-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "hardware-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.HardwareHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ParitySoftErrors(Entity):
"""
Parity soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors, self).__init__()
self.yang_name = "parity-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "parity-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "parity-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ParitySoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class DescriptorSoftErrors(Entity):
"""
Descriptor soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors, self).__init__()
self.yang_name = "descriptor-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "descriptor-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "descriptor-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.DescriptorSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class InterfaceSoftErrors(Entity):
"""
Interface soft error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors, self).__init__()
self.yang_name = "interface-soft-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "interface-soft-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "interface-soft-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.InterfaceSoftErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class IoHardErrors(Entity):
"""
IO hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors, self).__init__()
self.yang_name = "io-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "io-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "io-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.IoHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class ResetHardErrors(Entity):
"""
Reset hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors, self).__init__()
self.yang_name = "reset-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "reset-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
Low threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
Low period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "reset-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.ResetHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class UcodeHardErrors(Entity):
"""
UCode hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors, self).__init__()
self.yang_name = "ucode-hard-errors"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "ucode-hard-errors"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "ucode-hard-errors"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.UcodeHardErrors.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
class AsicErrorMbeHard(Entity):
"""
Indirect hard error information
.. attribute:: error
Collection of errors
**type**\: list of :py:class:`Error <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard, self).__init__()
self.yang_name = "asic-error-mbe-hard"
self.yang_parent_name = "error-path"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("error", ("error", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error))])
self._leafs = OrderedDict()
self.error = YList(self)
self._segment_path = lambda: "asic-error-mbe-hard"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard, [], name, value)
class Error(Entity):
"""
Collection of errors
.. attribute:: name
Name assigned to mem
**type**\: str
.. attribute:: asic_info
Name of rack/board/asic
**type**\: str
.. attribute:: node_key
32 bit key
**type**\: int
**range:** 0..4294967295
.. attribute:: alarm_on
High threshold crossed
**type**\: bool
.. attribute:: thresh_hi
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_hi
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: thresh_lo
High threshold value
**type**\: int
**range:** 0..4294967295
.. attribute:: period_lo
High period value
**type**\: int
**range:** 0..4294967295
.. attribute:: count
Accumulated count
**type**\: int
**range:** 0..4294967295
.. attribute:: intr_type
Type of error
**type**\: int
**range:** 0..4294967295
.. attribute:: leaf_id
Leaf ID defined in user data
**type**\: int
**range:** 0..4294967295
.. attribute:: last_cleared
Time cleared
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: csrs_info
List of csrs\_info
**type**\: list of :py:class:`CsrsInfo <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.CsrsInfo>`
.. attribute:: last_err
Last Printable error information
**type**\: list of :py:class:`LastErr <ydk.models.cisco_ios_xr.Cisco_IOS_XR_asic_errors_oper.AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.LastErr>`
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error, self).__init__()
self.yang_name = "error"
self.yang_parent_name = "asic-error-mbe-hard"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([("csrs-info", ("csrs_info", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.CsrsInfo)), ("last-err", ("last_err", AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.LastErr))])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('asic_info', YLeaf(YType.str, 'asic-info')),
('node_key', YLeaf(YType.uint32, 'node-key')),
('alarm_on', YLeaf(YType.boolean, 'alarm-on')),
('thresh_hi', YLeaf(YType.uint32, 'thresh-hi')),
('period_hi', YLeaf(YType.uint32, 'period-hi')),
('thresh_lo', YLeaf(YType.uint32, 'thresh-lo')),
('period_lo', YLeaf(YType.uint32, 'period-lo')),
('count', YLeaf(YType.uint32, 'count')),
('intr_type', YLeaf(YType.uint32, 'intr-type')),
('leaf_id', YLeaf(YType.uint32, 'leaf-id')),
('last_cleared', YLeaf(YType.uint64, 'last-cleared')),
])
self.name = None
self.asic_info = None
self.node_key = None
self.alarm_on = None
self.thresh_hi = None
self.period_hi = None
self.thresh_lo = None
self.period_lo = None
self.count = None
self.intr_type = None
self.leaf_id = None
self.last_cleared = None
self.csrs_info = YList(self)
self.last_err = YList(self)
self._segment_path = lambda: "error"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error, ['name', 'asic_info', 'node_key', 'alarm_on', 'thresh_hi', 'period_hi', 'thresh_lo', 'period_lo', 'count', 'intr_type', 'leaf_id', 'last_cleared'], name, value)
class CsrsInfo(Entity):
"""
List of csrs\_info
.. attribute:: name
name
**type**\: str
.. attribute:: address
address
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: width
width
**type**\: int
**range:** 0..4294967295
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.CsrsInfo, self).__init__()
self.yang_name = "csrs-info"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('name', YLeaf(YType.str, 'name')),
('address', YLeaf(YType.uint64, 'address')),
('width', YLeaf(YType.uint32, 'width')),
])
self.name = None
self.address = None
self.width = None
self._segment_path = lambda: "csrs-info"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.CsrsInfo, ['name', 'address', 'width'], name, value)
class LastErr(Entity):
"""
Last Printable error information
.. attribute:: at_time
at time
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: at_time_nsec
at time nsec
**type**\: int
**range:** 0..18446744073709551615
.. attribute:: counter_val
counter val
**type**\: int
**range:** 0..4294967295
.. attribute:: error_desc
error desc
**type**\: str
.. attribute:: error_regval
error regval
**type**\: list of int
**range:** 0..255
"""
_prefix = 'asic-errors-oper'
_revision = '2015-11-09'
def __init__(self):
super(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.LastErr, self).__init__()
self.yang_name = "last-err"
self.yang_parent_name = "error"
self.is_top_level_class = False
self.has_list_ancestor = True
self.ylist_key_names = []
self._child_container_classes = OrderedDict([])
self._child_list_classes = OrderedDict([])
self._leafs = OrderedDict([
('at_time', YLeaf(YType.uint64, 'at-time')),
('at_time_nsec', YLeaf(YType.uint64, 'at-time-nsec')),
('counter_val', YLeaf(YType.uint32, 'counter-val')),
('error_desc', YLeaf(YType.str, 'error-desc')),
('error_regval', YLeafList(YType.uint8, 'error-regval')),
])
self.at_time = None
self.at_time_nsec = None
self.counter_val = None
self.error_desc = None
self.error_regval = []
self._segment_path = lambda: "last-err"
def __setattr__(self, name, value):
self._perform_setattr(AsicErrors.Nodes.Node.AsicInformation.Instances.Instance.ErrorPath.AsicErrorMbeHard.Error.LastErr, ['at_time', 'at_time_nsec', 'counter_val', 'error_desc', 'error_regval'], name, value)
def clone_ptr(self):
self._top_entity = AsicErrors()
return self._top_entity
# ---------------------------------------------------------------------------
# address_validator.py -- ZackDowning/VLANInventory2 (MIT)
# ---------------------------------------------------------------------------
import re
def ipv4(address):
if re.fullmatch(
r'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.){3}'
r'([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])'
r'', address):
return True
else:
return False
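# A quick stdlib check of the octet pattern above (sample addresses are made
# up; the pattern is copied verbatim from ipv4()):

```python
import re

# Same octet alternation as ipv4(): 0-9, 10-99, 100-199, 200-249, 250-255.
IPV4_RE = (r'(([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])\.){3}'
           r'([0-9]|[1-9][0-9]|1[0-9][0-9]|2[0-4][0-9]|25[0-5])')

def is_ipv4(address):
    # fullmatch anchors both ends, so partial addresses are rejected.
    return re.fullmatch(IPV4_RE, address) is not None
```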
def ipv6(address):
    if re.fullmatch(
            r'(([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}|'
            r'([0-9a-fA-F]{1,4}:){7}:|'
            r'([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|'
            r'([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|'
            r'([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|'
            r'([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|'
            r'([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|'
            r'[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|'
            r':((:[0-9a-fA-F]{1,4}){1,7}|:)|'
            r'fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]+|::(ffff(:0{1,4})?:))'
            r'', address):
        return True
    else:
        return False
def macaddress(address):
    if '.' in address:
        if re.fullmatch(
                r'(('
                r'([0-9a-fA-F]){4}|'
                r'([0-9a-fA-F]){3}([a-fA-F0-9])|'
                r'(([a-fA-F0-9])([a-fA-F0-9]){3})|'
                r'((([0-9][a-fA-F])|([a-fA-F0-9])){2})|'
                r'(([a-fA-F0-9])([a-fA-F0-9]){2}([a-fA-F0-9])))\.){2}'
                r'(([0-9a-fA-F]){4})|'
                r'(([0-9a-fA-F]){3}([a-fA-F0-9]))|'
                r'(([a-fA-F0-9])([a-fA-F0-9]){3})|'
                r'((([0-9][a-fA-F])|([a-fA-F][0-9])){2})|'
                r'(([a-fA-F0-9])([a-fA-F0-9]){2}([a-fA-F0-9]))'
                r'', address):
            return True
        else:
            return False
    else:
        if re.fullmatch(
                r'(((([0-9a-fA-F]){2}-){5}|'
                r'(([0-9][a-fA-F]|[a-fA-F][0-9])-){5})'
                r'(([0-9a-fA-F]){2}|([0-9][a-fA-F]|[a-fA-F][0-9]){2}))|'
                r'(((([0-9a-fA-F]){2}:){5}|'
                r'(([0-9][a-fA-F]|[a-fA-F][0-9]):){5})'
                r'(([0-9a-fA-F]){2}|([0-9][a-fA-F]|[a-fA-F][0-9]){2}))'
                r'', address):
            return True
        else:
            return False
# ---------------------------------------------------------------------------
# openwisp_utils/admin_theme/site.py -- TPath123/openwisp-utils (BSD-3-Clause)
# ---------------------------------------------------------------------------
from django.utils.module_loading import import_string
from .settings import OPENWISP_ADMIN_SITE_CLASS
admin_site_class = import_string(OPENWISP_ADMIN_SITE_CLASS)
admin_site = admin_site_class()
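# import_string resolves a dotted path into the object it names; a minimal
# importlib sketch of the same idea (illustrative only, not the Django
# implementation -- import_string_sketch is a made-up name):

```python
from importlib import import_module

def import_string_sketch(dotted_path):
    # "math.sqrt" -> module "math", attribute "sqrt"
    module_path, _, attr_name = dotted_path.rpartition('.')
    return getattr(import_module(module_path), attr_name)

sqrt = import_string_sketch('math.sqrt')
```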
# ---------------------------------------------------------------------------
# siren/siren.py -- MrTornado24/FENeRF (MIT)
# ---------------------------------------------------------------------------
import sys
] | 1 | 2022-03-20T14:15:11.000Z | 2022-03-20T14:15:11.000Z | import sys
from numpy.lib.type_check import imag
from torch._C import device
from torch.functional import align_tensors
sys.path.append('/apdcephfs/share_1330077/starksun/projects/pi-GAN')
from fid_evaluation import output_images
import numpy as np
import torch.nn as nn
import torch
import math
import torch.nn.functional as F
from .latent_grid import StyleGenerator2D
from .layers import *
class Sine(nn.Module):
"""Sine Activation Function."""
def __init__(self):
super().__init__()
def forward(self, x):
return torch.sin(30. * x)
def sine_init(m):
with torch.no_grad():
if isinstance(m, nn.Linear):
num_input = m.weight.size(-1)
m.weight.uniform_(-np.sqrt(6 / num_input) / 30, np.sqrt(6 / num_input) / 30)
def first_layer_sine_init(m):
with torch.no_grad():
if isinstance(m, nn.Linear):
num_input = m.weight.size(-1)
m.weight.uniform_(-1 / num_input, 1 / num_input)
def film_sine_init(m):
with torch.no_grad():
if isinstance(m, nn.Linear):
num_input = m.weight.size(-1)
m.weight.uniform_(-np.sqrt(6 / num_input) / 30, np.sqrt(6 / num_input) / 30)
def first_layer_film_sine_init(m):
with torch.no_grad():
if isinstance(m, nn.Linear):
num_input = m.weight.size(-1)
m.weight.uniform_(-1 / num_input, 1 / num_input)
def kaiming_leaky_init(m):
classname = m.__class__.__name__
if classname.find('Linear') != -1:
torch.nn.init.kaiming_normal_(m.weight, a=0.2, mode='fan_in', nonlinearity='leaky_relu')
# class CustomMappingNetwork(nn.Module):
# def __init__(self, z_dim, map_hidden_dim, map_output_dim):
# super().__init__()
# self.network = nn.Sequential(nn.Linear(z_dim, map_hidden_dim),
# nn.LeakyReLU(0.2, inplace=True),
# nn.Linear(map_hidden_dim, map_hidden_dim),
# nn.LeakyReLU(0.2, inplace=True),
# nn.Linear(map_hidden_dim, map_hidden_dim),
# nn.LeakyReLU(0.2, inplace=True),
# nn.Linear(map_hidden_dim, map_output_dim))
# self.network.apply(kaiming_leaky_init)
# with torch.no_grad():
# self.network[-1].weight *= 0.25
# def forward(self, z):
# frequencies_offsets = self.network(z)
# frequencies = frequencies_offsets[..., :frequencies_offsets.shape[-1]//2]
# phase_shifts = frequencies_offsets[..., frequencies_offsets.shape[-1]//2:]
# return frequencies, phase_shifts
class CustomMappingNetwork(nn.Module):
def __init__(self, z_dim, map_hidden_dim, map_output_dim, n_blocks=3):
super().__init__()
self.network = [nn.Linear(z_dim, map_hidden_dim),
nn.LeakyReLU(0.2, inplace=True)]
for _ in range(n_blocks):
self.network.append(nn.Linear(map_hidden_dim, map_hidden_dim))
self.network.append(nn.LeakyReLU(0.2, inplace=True))
self.network.append(nn.Linear(map_hidden_dim, map_output_dim))
self.network = nn.Sequential(*self.network)
self.network.apply(kaiming_leaky_init)
with torch.no_grad():
self.network[-1].weight *= 0.25
def forward(self, z):
frequencies_offsets = self.network(z) # z: (n_batch * n_point, n_channel)
frequencies = frequencies_offsets[..., :frequencies_offsets.shape[-1]//2]
phase_shifts = frequencies_offsets[..., frequencies_offsets.shape[-1]//2:]
return frequencies, phase_shifts
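# The mapping output packs all layer frequencies first, then all phase shifts,
# along the last dimension; a numpy sketch of the split and of the per-layer
# slicing done later in forward_with_frequencies_phase_shifts (sizes here are
# illustrative, not the real hidden_dim):

```python
import numpy as np

hidden_dim, n_layers = 4, 3                                 # illustrative sizes
out = np.arange(2 * n_layers * hidden_dim).reshape(1, -1)   # stand-in for self.network(z)
half = out.shape[-1] // 2
frequencies, phase_shifts = out[..., :half], out[..., half:]

# Layer `index` reads its own hidden_dim-wide chunk of the frequency vector:
index = 1
start, end = index * hidden_dim, (index + 1) * hidden_dim
layer_freq = frequencies[..., start:end]
```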
def frequency_init(freq):
def init(m):
with torch.no_grad():
if isinstance(m, nn.Linear):
num_input = m.weight.size(-1)
m.weight.uniform_(-np.sqrt(6 / num_input) / freq, np.sqrt(6 / num_input) / freq)
return init
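# frequency_init applies the SIREN rule: weights uniform in
# +/- sqrt(6 / fan_in) / freq, so pre-activations stay well-scaled under the
# sin(freq * x) nonlinearity. A numpy sketch of the bound (sizes illustrative):

```python
import numpy as np

fan_in, freq = 256, 25
bound = np.sqrt(6 / fan_in) / freq                  # the uniform half-width
w = np.random.uniform(-bound, bound, size=(128, fan_in))
```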
class FiLMLayer(nn.Module):
def __init__(self, input_dim, hidden_dim):
super().__init__()
self.layer = nn.Linear(input_dim, hidden_dim)
def forward(self, x, freq, phase_shift):
x = self.layer(x)
if x.shape[1] != freq.shape[1]:
freq = freq.unsqueeze(1).expand_as(x) #TODO: all x conditioned on a single freq and phase_shift --> every x conditioned on a specific freq and phase_shift
phase_shift = phase_shift.unsqueeze(1).expand_as(x)
return torch.sin(freq * x + phase_shift)
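# FiLMLayer computes sin(freq * (W x + b) + phase); a numpy sketch of the
# per-sample broadcast that unsqueeze(1).expand_as performs (all shapes and
# values below are illustrative toy data):

```python
import numpy as np

x = np.ones((2, 8, 3))                  # (batch, points, in_dim)
W, b = np.full((3, 4), 0.1), np.zeros(4)
h = x @ W + b                           # the nn.Linear part
freq = np.full((2, 4), 30.0)            # one frequency vector per sample
phase = np.zeros((2, 4))
# Broadcast (batch, hidden) over the point axis, like expand_as in the layer:
out = np.sin(freq[:, None, :] * h + phase[:, None, :])
```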
class TALLSIREN(nn.Module):
"""Primary SIREN architecture used in pi-GAN generators."""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(input_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3), nn.Sigmoid())
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 1)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
rbg = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
rbg = self.color_layer_linear(rbg)
return torch.cat([rbg, sigma], dim=-1)
class UniformBoxWarp(nn.Module):
def __init__(self, sidelength):
super().__init__()
self.scale_factor = 2/sidelength
def forward(self, coordinates):
return coordinates * self.scale_factor
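# UniformBoxWarp rescales points from a cube of side `sidelength` into
# [-1, 1], matching the 0.24 box used below; a numpy sketch:

```python
import numpy as np

sidelength = 0.24
scale_factor = 2 / sidelength
pts = np.array([[-0.12, 0.0, 0.12]])    # box edge, center, box edge
warped = pts * scale_factor             # maps [-0.12, 0.12] -> [-1, 1]
```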
class SPATIALSIRENBASELINE(nn.Module):
"""Same architecture as TALLSIREN but adds a UniformBoxWarp to map input points to -1, 1"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 1)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
rbg = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([rbg, sigma], dim=-1)
class SPATIALSIRENBASELINEHD(nn.Module):
"""Same architecture as SPATIALSIRENBASELINE but use neural renderer"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 64))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 1)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
rbg = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
# rbg = torch.sigmoid(self.color_layer_linear(rbg))
rbg = self.color_layer_linear(rbg)
return torch.cat([rbg, sigma], dim=-1)
class UniformBoxWarp(nn.Module):
def __init__(self, sidelength):
super().__init__()
self.scale_factor = 2/sidelength
def forward(self, coordinates):
return coordinates * self.scale_factor
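# Illustrative sketch (not part of the model): UniformBoxWarp rescales points
# from a cube of side `sidelength` to the [-1, 1] range that grid_sample
# expects; with sidelength=0.24 a point on the cube face (0.12) maps to 1.0.

```python
def box_warp(coord, sidelength):
    # mirrors UniformBoxWarp: scale_factor = 2 / sidelength
    return coord * (2.0 / sidelength)
```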
def sample_from_3dgrid(coordinates, grid):
"""
Expects coordinates in shape (batch_size, num_points_per_batch, 3)
Expects grid in shape (1, channels, H, W, D)
(Also works if grid has batch size)
Returns sampled features of shape (batch_size, num_points_per_batch, feature_channels)
"""
coordinates = coordinates.float()
grid = grid.float()
batch_size, n_coords, n_dims = coordinates.shape
sampled_features = torch.nn.functional.grid_sample(grid.expand(batch_size, -1, -1, -1, -1),
coordinates.reshape(batch_size, 1, 1, -1, n_dims),
mode='bilinear', padding_mode='zeros', align_corners=True)
N, C, H, W, D = sampled_features.shape
sampled_features = sampled_features.permute(0, 4, 3, 2, 1).reshape(N, H*W*D, C)
return sampled_features
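# Pure-Python sketch (illustrative, no torch) of the flatten step in
# sample_from_3dgrid: with a (batch, 1, 1, n_points, 3) sampling grid,
# grid_sample returns (N, C, 1, 1, P), which is permuted to put channels last
# and reshaped to (N, P, C).

```python
def flatten_sampled(features):
    """Nested list of shape (N, C, 1, 1, P) -> (N, P, C), matching the
    permute(0, 4, 3, 2, 1).reshape(N, H*W*D, C) step when H == W == 1."""
    out = []
    for n_feat in features:  # iterate over the batch dimension
        n_channels = len(n_feat)
        n_points = len(n_feat[0][0][0])
        out.append([[n_feat[c][0][0][p] for c in range(n_channels)]
                    for p in range(n_points)])
    return out
```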
def modified_first_sine_init(m):
with torch.no_grad():
# if hasattr(m, 'weight'):
if isinstance(m, nn.Linear):
num_input = 3
m.weight.uniform_(-1 / num_input, 1 / num_input)
class EmbeddingPiGAN128(nn.Module):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details."""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(32 + 3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
print(self.network)
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 1)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
        # !! Important !! Set this value to the expected side length of your scene. e.g. for faces, heads usually fit in
        # a box of side length 0.24, since the camera has such a narrow FOV. For other scenes with a wider FOV, this probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
x = torch.cat([shared_features, input], -1)
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
        rgb = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
class EmbeddingPiGAN256(EmbeddingPiGAN128):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs, hidden_dim=256)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 64, 64, 64)*0.1)
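# Illustrative sketch (not part of the model): the raw mapping-network outputs
# are small and roughly zero-centered, so every forward pass above applies
# frequencies*15 + 30 to recenter the FiLM frequencies around 30 before they
# modulate the sine layers.

```python
def film_frequency(raw):
    # affine used in every forward_with_frequencies_phase_shifts above
    return raw * 15 + 30
```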
class SPATIALSIRENGRID(nn.Module):
"""Same architecture as SPATIALSIRENBASELINE but use local latent sampled from grid"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.local_coordinates = True
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(32, 256, (len(self.network) + 1)*hidden_dim*2, n_blocks=1)
self.grid_latent_network = StyleGenerator2D(out_res=32, out_ch=32, z_dim=z_dim, ch_mul=1, ch_max=256, skip_conn=False)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
latent_grid = self.grid_latent_network(z)
input_grid = self.gridwarper(input) # range: (-1.4, 1.4)
sampled_latent = self.sample_local_latents(latent_grid, input_grid)
frequencies, phase_shifts = self.mapping_network(sampled_latent)
        if self.local_coordinates:
            # map global coordinate space into local coordinate space (i.e. each grid cell has a [-1, 1] range)
            preserve_y = sampled_latent.ndim == 4  # if latents are 2D, then keep the y coordinate global
            input = self.get_local_coordinates(
                global_coords=input, local_grid_length=32, preserve_y=preserve_y
            )
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, box_warp=False, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
x = self.gridwarper(input)
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
        rgb = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
def sample_local_latents(self, local_latents, xyz):
B, local_z_dim, H, W = local_latents.shape
        # take only x and z coordinates, since the latent codes live on a 2D grid (no y dimension)
        # grid_sample expects a (B, H_out, W_out, 2) grid; we fold all points into the W dimension
        xyz = xyz[:, :, [0, 2]].unsqueeze(1)  # (B, 1, n_points, 2)
# all samples get the most detailed latent codes
sampled_local_latents = nn.functional.grid_sample(
input=local_latents, # (b, c, h, w)
grid=xyz, # (b, 1, n_pixel, 2)
mode='bilinear', # bilinear mode will use trilinear interpolation if input is 5D
align_corners=False,
padding_mode="zeros",
)
        # grid_sample output has shape (B, local_z_dim, 1, n_points)
        # put channel dimension at the end: (B, 1, n_points, local_z_dim)
        sampled_local_latents = sampled_local_latents.permute(0, 2, 3, 1)
        # flatten to (B, n_points, local_z_dim)
        sampled_local_latents = sampled_local_latents.reshape(B, -1, local_z_dim)
return sampled_local_latents
def get_local_coordinates(self, global_coords, local_grid_length, preserve_y=True):
local_coords = global_coords.clone()
# it is assumed that the global coordinates are scaled to [-1, 1]
# convert to [0, 1] scale
local_coords = (local_coords + 1) / 2
# scale so that each grid cell in the local_latent grid is 1x1 in size
local_coords = local_coords * local_grid_length
# subtract integer from each coordinate so that they are all in range [0, 1]
local_coords = local_coords - (local_coords - 0.5).round()
# return to [-1, 1] scale
local_coords = (local_coords * 2) - 1
if preserve_y:
# preserve the y dimension in the global coordinate frame, since it doesn't have a local latent code
coords = torch.cat([local_coords[..., 0:1], global_coords[..., 1:2], local_coords[..., 2:3]], dim=-1)
else:
coords = torch.cat([local_coords[..., 0:1], local_coords[..., 1:2], local_coords[..., 2:3]], dim=-1)
return coords
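# Pure-Python sketch (illustrative, not part of the model) of
# get_local_coordinates for a single scalar coordinate: rescale [-1, 1] to
# [0, grid_length], drop the integer cell index, and rescale the fractional
# position within the cell back to [-1, 1]. Python's round() matches
# torch.round's round-half-to-even behavior.

```python
def local_coord(c, grid_length):
    t = (c + 1) / 2 * grid_length      # [-1, 1] -> [0, grid_length]
    frac = t - round(t - 0.5)          # keep the position within the cell
    return frac * 2 - 1                # [0, 1] -> [-1, 1]
```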
class SPATIALSIRENVOLUME(nn.Module):
"""Same architecture as SPATIALSIRENBASELINE but use local latent sampled from volume"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(32, 256, (len(self.network) + 1)*hidden_dim*2)
# self.volume_latent_network = VolumeStyleGenerator(
# mapping_fmaps=z_dim,
# style_mixing_prob=0.9, # Probability of mixing styles during training. None = disable.
# truncation_psi=0.7, # Style strength multiplier for the truncation trick. None = disable.
# truncation_cutoff=8, # Number of layers for which to apply the truncation trick. None = disable.
# resolution=32,
# fmap_base=512,
# fmap_max=256)
self.volume_latent_network = VolumeStyleGenerator(input_nc=z_dim, output_nc=32, n_samples=3, norm='batch', activation='ReLU', padding_type='zero')
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
latent_grid = self.volume_latent_network(z)
input_grid = self.gridwarper(input)
# interpolate latent
# samples = F.grid_sample(latent_grid,
# input[..., [0, 2]].unsqueeze(2),
# align_corners=True,
# mode='bilinear',
# padding_mode='zeros')
samples = sample_from_3dgrid(input_grid, latent_grid)
frequencies, phase_shifts = self.mapping_network(samples)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, box_warp=False, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
        rgb = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., -self.hidden_dim:], phase_shifts[..., -self.hidden_dim:])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
class SPATIALSIRENSEMANTIC(nn.Module):
"""Same architecture as TALLSIREN but synthesis semantic map"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.label_layer_sine = FiLMLayer(hidden_dim, hidden_dim)
self.label_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 19)) # 19 semantic labels
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 2)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.label_layer_sine.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
n_batch, n_pixel = input.shape[:2]
# output = torch.zeros((n_batch, n_pixel, self.output_dim)).to(input)
# for b in range(n_batch):
# head = 0
# while head < n_pixel:
# tail = head + self.max_batch_size
# output[b:b+1, head:tail] = self.forward_with_frequencies_phase_shifts(input[b:b+1, head:tail], frequencies[b:b+1], phase_shifts[b:b+1], ray_directions[b:b+1, head:tail], **kwargs)
# head += self.max_batch_size
# return output
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
start += self.hidden_dim
end += self.hidden_dim
sigma = self.final_layer(x)
labels = self.label_layer_sine(x, frequencies[..., start:end], phase_shifts[..., start:end])
# TODO: w. / w.o softmax activation on label
labels = self.label_layer_linear(labels)
start += self.hidden_dim
end += self.hidden_dim
        rgb = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., start:end], phase_shifts[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([labels, rgb, sigma], dim=-1)
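# Illustrative sketch (not part of the model): the semantic models concatenate
# [labels, rgb, sigma] along the last dimension, so with 19 semantic labels a
# prediction vector has 19 + 3 + 1 = 23 channels. The helper name is
# hypothetical.

```python
def split_semantic_output(vec, n_labels=19):
    """Split a concatenated [labels, rgb, sigma] vector back into its parts."""
    labels = vec[:n_labels]
    rgb = vec[n_labels:n_labels + 3]
    sigma = vec[n_labels + 3:]
    return labels, rgb, sigma
```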
class SPATIALSIRENBASELINESEMANTIC(nn.Module):
"""Same architecture as SPATIALSIRENSEMANTIC but doesn't condition on geometry code when regressing labels"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, 19)) # 19 semantic labels
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 1)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
n_batch, n_pixel = input.shape[:2]
# output = torch.zeros((n_batch, n_pixel, self.output_dim)).to(input)
# for b in range(n_batch):
# head = 0
# while head < n_pixel:
# tail = head + self.max_batch_size
# output[b:b+1, head:tail] = self.forward_with_frequencies_phase_shifts(input[b:b+1, head:tail], frequencies[b:b+1], phase_shifts[b:b+1], ray_directions[b:b+1, head:tail], **kwargs)
# head += self.max_batch_size
# return output
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
sigma = self.final_layer(x)
# labels = torch.sigmoid(self.label_layer_linear(x))
labels = self.label_layer_linear(x)
start += self.hidden_dim
end += self.hidden_dim
        rgb = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., start:end], phase_shifts[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([labels, rgb, sigma], dim=-1)
class SPATIALSIRENDISENTANGLE(nn.Module):
"""Same architecture as TALLSIREN but use double latent codes"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
        frequencies_app = frequencies_app*15 + 30  # TODO: why apply this transform
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
        rgb = torch.cat([ray_directions, x], dim=-1)
        sigma = self.final_layer(x)
        for index, layer in enumerate(self.color_layer_sine):
            start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
            rgb = layer(rgb, frequencies_app[..., start:end], phase_shifts_app[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
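# Illustrative sketch (not part of the model): with disentangled codes, the
# geometry mapping network must emit len(self.network)*hidden_dim frequencies
# plus as many phase shifts, and the appearance network
# len(self.color_layer_sine)*hidden_dim of each, hence the *2 output widths
# passed to CustomMappingNetwork above.

```python
def mapping_output_width(n_film_layers, hidden_dim):
    # one frequency and one phase shift per hidden unit per FiLM layer
    return n_film_layers * hidden_dim * 2
```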
class SPATIALSIRENDISENTANGLE_debug(nn.Module):
"""Same architecture as TALLSIREN but use double latent codes"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_pre = nn.Sequential(nn.Linear(hidden_dim, hidden_dim))
# self.color_layer_sine = FiLMLayer(hidden_dim + 32, hidden_dim) # ray_drection dim: 3 --> 32
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.dir_mapping_network = nn.Sequential(
nn.Linear(3, 256),
nn.Linear(256, 32)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
# n_batch, n_pixel = input.shape[:2]
# output = torch.zeros((n_batch, n_pixel, self.output_dim)).to(input)
# for b in range(n_batch):
# head = 0
# while head < n_pixel:
# tail = head + self.max_batch_size
# output[b:b+1, head:tail] = self.forward_with_frequencies_phase_shifts(input[b:b+1, head:tail], frequencies[b:b+1], phase_shifts[b:b+1], ray_directions[b:b+1, head:tail], **kwargs)
# head += self.max_batch_size
# return output
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
        frequencies_app = frequencies_app*15 + 30  # TODO: why apply this transform
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
sigma = self.final_layer(x)
# ray_directions = self.dir_mapping_network(ray_directions)
x = self.color_layer_pre(x)
        rgb = torch.cat([ray_directions, x], dim=-1)
        for index, layer in enumerate(self.color_layer_sine):
            start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
            rgb = layer(rgb, frequencies_app[..., start:end], phase_shifts_app[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
class SPATIALSIRENAUGDISENTANGLE(nn.Module):
"""Same architecture as SPATIALSIRENDISENTANGLE but has augmented color branch and narrower density feature branch"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_pre = nn.Sequential(
nn.Linear(hidden_dim, 3),
)
# self.color_layer_sine = FiLMLayer(hidden_dim + 32, hidden_dim) # ray_drection dim: 3 --> 32
self.color_layer_sine = nn.ModuleList([
FiLMLayer(3 + 3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
        frequencies_app = frequencies_app*15 + 30  # TODO: why apply this transform
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
sigma = self.final_layer(x)
x = self.color_layer_pre(x)
        rgb = torch.cat([ray_directions, x], dim=-1)
        for index, layer in enumerate(self.color_layer_sine):
            start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
            rgb = layer(rgb, frequencies_app[..., start:end], phase_shifts_app[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
class RESSIRENDISENTANGLE(nn.Module):
"""
Same architecture as SIRENDISENTANGLE but use residual architecure
code accroding to http://gvv.mpi-inf.mpg.de/projects/i3DMM/
"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.res_coord_layer = nn.Linear(hidden_dim, 3)
self.density_layer_linear = nn.Sequential(
nn.Linear(3, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, 1)
)
self.color_layer_pre = nn.Sequential(nn.Linear(3, hidden_dim))
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
# self.dir_mapping_network = nn.Sequential(
# nn.Linear(3, 256),
# nn.Linear(256, 32)
# )
self.network.apply(frequency_init(25))
self.density_layer_linear.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
# n_batch, n_pixel = input.shape[:2]
# output = torch.zeros((n_batch, n_pixel, self.output_dim)).to(input)
# for b in range(n_batch):
# head = 0
# while head < n_pixel:
# tail = head + self.max_batch_size
# output[b:b+1, head:tail] = self.forward_with_frequencies_phase_shifts(input[b:b+1, head:tail], frequencies[b:b+1], phase_shifts[b:b+1], ray_directions[b:b+1, head:tail], **kwargs)
# head += self.max_batch_size
# return output
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
        frequencies_app = frequencies_app*15 + 30  # TODO: why apply this transform
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
coords_res = self.res_coord_layer(x)
input = input + coords_res
sigma = self.density_layer_linear(input)
# ray_directions = self.dir_mapping_network(ray_directions)
        rgb = self.color_layer_pre(input)
        rgb = torch.cat([ray_directions, rgb], dim=-1)
        for index, layer in enumerate(self.color_layer_sine):
            start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
            rgb = layer(rgb, frequencies_app[..., start:end], phase_shifts_app[..., start:end])
        rgb = torch.sigmoid(self.color_layer_linear(rgb))
        return torch.cat([rgb, sigma], dim=-1)
class SPATIALSIRENSEMANTICDISENTANGLE(nn.Module):
"""Same architecture as TALLSIREN but use double latent codes and render semantic maps"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim - 4)) # output_dim = seg_channel + rgb_channel + density_channel
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.color_layer_sine[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30 # TODO: why apply this affine transform?
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
# rbg = torch.cat([ray_directions, input, labels], dim=-1)
rbg = torch.cat([ray_directions, x], dim=-1)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
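# The loops above hand each FiLM layer its own hidden_dim-wide slice of the
# frequency/phase vectors, which is why each mapping network emits
# len(layers) * hidden_dim * 2 values. Illustrative helper reproducing the
# start/end arithmetic (helper name is ours, not from the original code):
def _film_slice_bounds(layer_index, hidden_dim):
    """(start, end) slice of the frequency/phase vector for one FiLM layer."""
    start = layer_index * hidden_dim
    end = (layer_index + 1) * hidden_dim
    return start, end
# With hidden_dim=256: layer 0 reads [0:256), layer 1 reads [256:512), ...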
class SIRENBASELINESEMANTICDISENTANGLE(nn.Module):
"""Same architecture as TALLSIREN baseline but use double latent codes and render semantic maps"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim - 4))
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([ray_directions, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
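# All semantic heads return torch.cat([labels, rbg, sigma], dim=-1), i.e.
# output_dim = (output_dim - 4) label channels + 3 color channels + 1 density
# channel. An illustrative, dependency-free splitter for one point's output
# vector (channel order assumed from the concatenation above):
def _split_semantic_output(point, output_dim):
    """Split a flat per-point output vector into (labels, rgb, sigma)."""
    n_labels = output_dim - 4
    labels = point[:n_labels]
    rgb = point[n_labels:n_labels + 3]
    sigma = point[n_labels + 3]
    return labels, rgb, sigma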
class SIRENBASELINESEMANTICDISENTANGLE_debug(nn.Module):
"""Same architecture as SIRENBASELINESEMANTICDISENTANGLE_debug except adding sigmoid to label"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, 19)) # 19 semantic labels
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([ray_directions, x], dim=-1)
sigma = self.final_layer(x)
labels = torch.sigmoid(self.label_layer_linear(x))
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
class SPATIALSIRENSEMANTICHD(nn.Module):
"""Same architecture as SPATIALSIRENSEMANTIC but on a high resolution"""
def __init__(self, input_dim=2, z_dim=100, hidden_dim=256, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_dim = z_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.max_batch_size = 2500
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
self.label_layer_sine = FiLMLayer(hidden_dim, hidden_dim)
self.label_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 64)) # 64 label feature channels
self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 64))
self.mapping_network = CustomMappingNetwork(z_dim, 256, (len(self.network) + 2)*hidden_dim*2)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.label_layer_sine.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.network[0].apply(first_layer_film_sine_init)
self.activation = nn.Softmax(dim=-1)
self.gridwarper = UniformBoxWarp(0.24) # Don't worry about this, it was added to ensure compatibility with another model. Shouldn't affect performance.
def forward(self, input, z, ray_directions, **kwargs):
frequencies, phase_shifts = self.mapping_network(z)
return self.forward_with_frequencies_phase_shifts(input, frequencies, phase_shifts, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies, phase_shifts, ray_directions, **kwargs):
frequencies = frequencies*15 + 30
input = self.gridwarper(input)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies[..., start:end], phase_shifts[..., start:end])
start += self.hidden_dim
end += self.hidden_dim
sigma = self.final_layer(x)
labels = self.label_layer_sine(x, frequencies[..., start:end], phase_shifts[..., start:end])
# TODO: compare with vs. without softmax activation on the labels
labels = self.label_layer_linear(labels)
start += self.hidden_dim
end += self.hidden_dim
rbg = self.color_layer_sine(torch.cat([ray_directions, x], dim=-1), frequencies[..., start:end], phase_shifts[..., start:end])
rbg = self.color_layer_linear(rbg)
# rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
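# SPATIALSIRENSEMANTICHD defines max_batch_size = 2500 but never uses it in
# this chunk; presumably (an assumption) it is meant for chunked evaluation of
# large pixel batches at high resolution. A dependency-free sketch of the
# chunking arithmetic such an evaluator would use:
def _chunk_bounds(n_items, max_batch_size):
    """Yield (start, end) index ranges covering n_items in fixed-size chunks."""
    for start in range(0, n_items, max_batch_size):
        yield start, min(start + max_batch_size, n_items)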
class EmbeddingPiGAN128SEMANTICDISENTANGLE(nn.Module):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details."""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(32 + 3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
# self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim-4)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
# !! Important !! Set this value to the expected side-length of your scene. e.g. for faces, heads usually fit in
# a box of side-length 0.24, since the camera has such a narrow FOV. For other scenes, with higher FOV, probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
x = torch.cat([shared_features, input], -1)
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([ray_directions, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
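# The spatial embedding cube above is a learnable nn.Parameter of shape
# (1, 32, 96, 96, 96): 32 feature channels on a 96^3 grid, queried per point by
# sample_from_3dgrid (defined elsewhere in this file). That is a sizable memory
# cost; a quick dependency-free check of the parameter count:
def _embedding_param_count(channels=32, resolution=96):
    """Number of learnable values in a (1, channels, res, res, res) cube."""
    return channels * resolution ** 3
# 32 * 96^3 = 28,311,552 parameters (~28.3M); the 64^3 variant uses ~8.4M.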
class TextureEmbeddingPiGAN128SEMANTICDISENTANGLE(nn.Module):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
# self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+32+3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim-4)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
# !! Important !! Set this value to the expected side-length of your scene. e.g. for faces, heads usually fit in
# a box of side-length 0.24, since the camera has such a narrow FOV. For other scenes, with higher FOV, probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
# x = torch.cat([shared_features, input], -1)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([ray_directions, shared_features, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
class TextureEmbeddingPiGAN256SEMANTICDISENTANGLE(TextureEmbeddingPiGAN128SEMANTICDISENTANGLE):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs, hidden_dim=256)
self.spatial_embeddings = nn.Parameter(torch.randn(1,32,64,64,64)*0.1)
class TextureEmbeddingPiGAN256SEMANTICDISENTANGLE_DIM_96(TextureEmbeddingPiGAN128SEMANTICDISENTANGLE):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs, hidden_dim=256)
self.spatial_embeddings = nn.Parameter(torch.randn(1,32,96,96,96)*0.1)
class TextureEmbeddingPiGAN128SEMANTICDISENTANGLE_WO_DIR(nn.Module):
"""
1. Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network;
2. remove view direction
3. add more color layers
"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
# self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+32, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim-4)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
self.color_layer_sine[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
# !! Important !! Set this value to the expected side-length of your scene. e.g. for faces, heads usually fit in
# a box of side-length 0.24, since the camera has such a narrow FOV. For other scenes, with higher FOV, probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
# x = torch.cat([shared_features, input], -1)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([shared_features, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
class TextureEmbeddingPiGAN128SEMANTICDISENTANGLE_WO_DIR_debug(nn.Module):
"""
1. Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network;
2. remove view direction
3. add more color layers
"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
# self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+32, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim-4)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
self.color_layer_sine[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
# !! Important !! Set this value to the expected side-length of your scene. e.g. for faces, heads usually fit in
# a box of side-length 0.24, since the camera has such a narrow FOV. For other scenes, with higher FOV, probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
# x = torch.cat([shared_features, input], -1)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([shared_features, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
class TextureEmbeddingPiGAN128SEMANTICDISENTANGLE_WO_DIR_debug2(nn.Module):
"""
1. Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network;
2. remove view direction
3. add more color layers
"""
def __init__(self, input_dim=2, z_geo_dim=100, z_app_dim=100, hidden_dim=128, output_dim=1, device=None):
super().__init__()
self.device = device
self.input_dim = input_dim
self.z_geo_dim = z_geo_dim
self.z_app_dim = z_app_dim
self.hidden_dim = hidden_dim
self.output_dim = output_dim
self.network = nn.ModuleList([
FiLMLayer(3, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.final_layer = nn.Linear(hidden_dim, 1)
# self.color_layer_sine = FiLMLayer(hidden_dim + 3, hidden_dim)
self.color_layer_sine = nn.ModuleList([
FiLMLayer(hidden_dim+32, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
FiLMLayer(hidden_dim, hidden_dim),
])
self.color_layer_linear = nn.Sequential(nn.Linear(hidden_dim, 3))
self.geo_mapping_network = CustomMappingNetwork(z_geo_dim, 256, len(self.network)*hidden_dim*2)
self.app_mapping_network = CustomMappingNetwork(z_app_dim, 256, len(self.color_layer_sine)*hidden_dim*2)
self.label_layer_linear = nn.Sequential(
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, hidden_dim),
nn.Linear(hidden_dim, self.output_dim-4)
)
self.network.apply(frequency_init(25))
self.final_layer.apply(frequency_init(25))
self.color_layer_sine.apply(frequency_init(25))
self.color_layer_linear.apply(frequency_init(25))
self.label_layer_linear.apply(frequency_init(25))
self.network[0].apply(modified_first_sine_init)
# self.color_layer_sine[0].apply(modified_first_sine_init)
self.spatial_embeddings = nn.Parameter(torch.randn(1, 32, 96, 96, 96)*0.01)
# !! Important !! Set this value to the expected side-length of your scene. e.g. for faces, heads usually fit in
# a box of side-length 0.24, since the camera has such a narrow FOV. For other scenes, with higher FOV, probably needs to be bigger.
self.gridwarper = UniformBoxWarp(0.24)
def forward(self, input, z_geo, z_app, ray_directions, **kwargs):
frequencies_geo, phase_shifts_geo = self.geo_mapping_network(z_geo)
frequencies_app, phase_shifts_app = self.app_mapping_network(z_app)
return self.forward_with_frequencies_phase_shifts(input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs)
def forward_with_frequencies_phase_shifts(self, input, frequencies_geo, frequencies_app, phase_shifts_geo, phase_shifts_app, ray_directions, **kwargs):
frequencies_geo = frequencies_geo*15 + 30
frequencies_app = frequencies_app*15 + 30
input = self.gridwarper(input)
shared_features = sample_from_3dgrid(input, self.spatial_embeddings)
# x = torch.cat([shared_features, input], -1)
x = input
for index, layer in enumerate(self.network):
start = index * self.hidden_dim
end = (index+1) * self.hidden_dim
x = layer(x, frequencies_geo[..., start:end], phase_shifts_geo[..., start:end])
rbg = torch.cat([shared_features, x], dim=-1)
sigma = self.final_layer(x)
labels = self.label_layer_linear(x)
for index, layer in enumerate(self.color_layer_sine):
start, end = index * self.hidden_dim, (index+1) * self.hidden_dim
rbg = layer(rbg, frequencies_app[..., start:end], phase_shifts_app[..., start: end])
rbg = torch.sigmoid(self.color_layer_linear(rbg))
return torch.cat([labels, rbg, sigma], dim=-1)
class TextureEmbeddingPiGAN256SEMANTICDISENTANGLE_WO_DIR_DIM_96(TextureEmbeddingPiGAN128SEMANTICDISENTANGLE_WO_DIR):
"""Smaller architecture that has an additional cube of embeddings. Often gives better fine details.
Embeddings are in color prediction branch instead of density network"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs, hidden_dim=256)
self.spatial_embeddings = nn.Parameter(torch.randn(1,32,96,96,96)*0.1)
def main():
# model = SPATIALSIRENVOLUME(input_dim=3, z_dim=256, hidden_dim=256, output_dim=4, device=None)
model = SPATIALSIRENSEMANTIC(input_dim=3, z_dim=256, hidden_dim=256, output_dim=4, device=None)
input, z, ray_directions = torch.randn(2, 4000, 3), torch.rand(2, 256), torch.rand(2, 4000, 3)
output = model(input, z, ray_directions)
print(output.shape)
if __name__ == "__main__":
main()
# File: rootfs/usr/lib/python3/dist-packages/numpy/distutils/compat.py
# Repo: kappaIO-Dev/kappaIO-sdk-armhf-crosscompile (MIT)
"""Small modules to cope with python 2 vs 3 incompatibilities inside
numpy.distutils
"""
import sys
def get_exception():
return sys.exc_info()[1]
| 18.875 | 68 | 0.741722 | 23 | 151 | 4.782609 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023622 | 0.15894 | 151 | 7 | 69 | 21.571429 | 0.84252 | 0.536424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
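For context, `get_exception()` is the Python 2/3-portable way to grab the exception currently being handled, avoiding the `except Exc as e` binding syntax that very old Python versions could not parse; a small usage sketch:

```python
import sys

def get_exception():
    # Portable replacement for `except Exception as e`:
    # returns the exception instance currently being handled.
    return sys.exc_info()[1]

try:
    raise ValueError("boom")
except ValueError:
    exc = get_exception()

print(type(exc).__name__, exc)  # → ValueError boom
```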
1c56f3da5203656cfb5c5f767865ff64a9b22e0a | 2,198 | py | Python | final_submission/parser/json_dump.py | Groverkss/Breeding-Horses | 8a8e3c5114ec7c26c87d7517bac7a0bb3f2b19a7 | [
"MIT"
] | null | null | null | final_submission/parser/json_dump.py | Groverkss/Breeding-Horses | 8a8e3c5114ec7c26c87d7517bac7a0bb3f2b19a7 | [
"MIT"
] | null | null | null | final_submission/parser/json_dump.py | Groverkss/Breeding-Horses | 8a8e3c5114ec7c26c87d7517bac7a0bb3f2b19a7 | [
"MIT"
] | null | null | null | import json
data = [
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.74726809e-16,
0.0,
1.52226779e-05,
-1.04623995e-06,
-5.47552982e-09,
3.8344835e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.77e-16,
0.0,
1.52294786e-05,
-1.03985242e-06,
-5.4574558e-09,
3.78988721e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.74726809e-16,
0.0,
1.52226779e-05,
-1.04623995e-06,
-5.47552982e-09,
3.8344835e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.74726809e-16,
0.0,
1.51827267e-05,
-1.04623995e-06,
-5.47549689e-09,
3.83653816e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.77e-16,
0.0,
1.52294786e-05,
-1.03985242e-06,
-5.4574558e-09,
3.78988721e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.74726809e-16,
0.0,
1.52226779e-05,
-1.04623995e-06,
-5.47552982e-09,
3.8344835e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.77e-16,
0.0,
1.52294786e-05,
-1.03985242e-06,
-5.4574558e-09,
3.78988721e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.77e-16,
0.0,
1.52294786e-05,
-1.03985242e-06,
-5.38474558e-09,
3.78988721e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.74726809e-16,
0.0,
1.52226779e-05,
-1.04623995e-06,
-5.47552982e-09,
3.8344835e-10,
],
[
0.0,
0.0,
0.0,
0.0,
0.0,
-5.77e-16,
0.0,
1.52294786e-05,
-1.03985242e-06,
-5.4574558e-09,
3.78988721e-10,
],
]
with open("output.json", "w") as outfile:
json.dump(data, outfile)
| 15.927536 | 41 | 0.338035 | 284 | 2,198 | 2.616197 | 0.116197 | 0.269179 | 0.323015 | 0.376851 | 0.865411 | 0.845222 | 0.845222 | 0.845222 | 0.845222 | 0.845222 | 0 | 0.580349 | 0.50455 | 2,198 | 137 | 42 | 16.043796 | 0.101928 | 0 | 0 | 0.859259 | 0 | 0 | 0.00546 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.007407 | 0 | 0.007407 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 13 |
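A quick way to sanity-check a dump like the one above is a round trip through `json.dump`/`json.load`; Python 3's float repr round-trips exactly, so the restored list compares equal (sketched with an in-memory buffer instead of `output.json`):

```python
import io
import json

data = [[0.0, -5.74726809e-16, 1.52226779e-05, -1.04623995e-06]]

buf = io.StringIO()
json.dump(data, buf)       # serialize, as the script does to output.json
buf.seek(0)
restored = json.load(buf)  # parse it back

assert restored == data    # floats survive the round trip exactly
print(restored[0][2])  # → 1.52226779e-05
```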
c743da3bb3ab3a0334c54507a96760529660fa1a | 259 | py | Python | nginx_access_tailer/__init__.py | swfrench/nginx-access-tailer | 5e060396ca749935c622e8e9c50b659b39e3675b | [
"BSD-3-Clause"
] | null | null | null | nginx_access_tailer/__init__.py | swfrench/nginx-access-tailer | 5e060396ca749935c622e8e9c50b659b39e3675b | [
"BSD-3-Clause"
] | null | null | null | nginx_access_tailer/__init__.py | swfrench/nginx-access-tailer | 5e060396ca749935c622e8e9c50b659b39e3675b | [
"BSD-3-Clause"
] | null | null | null | """Minimal top-level imports."""
from nginx_access_tailer.instance_metadata import InstanceMetadata
from nginx_access_tailer.nginx_access_log_consumer import NginxAccessLogConsumer
from nginx_access_tailer.nginx_access_log_tailer import NginxAccessLogTailer
| 43.166667 | 80 | 0.895753 | 32 | 259 | 6.84375 | 0.5 | 0.251142 | 0.205479 | 0.287671 | 0.319635 | 0.319635 | 0.319635 | 0 | 0 | 0 | 0 | 0 | 0.061776 | 259 | 5 | 81 | 51.8 | 0.901235 | 0.100386 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
c74ae5f7d31d33d1e61caadd4f6839a8091cc959 | 5,933 | py | Python | tests/parser/16-Incremental-Scheduling.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/16-Incremental-Scheduling.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | tests/parser/16-Incremental-Scheduling.asp.test.py | veltri/DLV2 | 944aaef803aa75e7ec51d7e0c2b0d964687fdd0e | [
"Apache-2.0"
] | null | null | null | input = """
%
% ******GRINGO 3.x REQUIRED******
%
%
time(0).
time(T+1) :- time(T), T < MT, max_value(MT).
%time(0..MT) :- max_value(MT).
pen_value(T) :- time(T).
td_value(T) :- time(T).
instance_of(D,1) :- device(D).
instance_of(D,I+1) :- device(D), instance_of(D,I), instances(D,N), I < N.
% Pick a unique start time and instance for each job
1 <= { start(J,S) : time(S) } <= 1 :- job(J).
1 <= { on_instance(J,I) : instance_of(D,I) } <= 1 :- job(J), job_device(J,D).
%----------------------
% - overlap
%----------------------
:- on_instance(J1,I), on_instance(J2,I), J1 != J2,
job_device(J1,D), job_device(J2,D),
start(J1,S1), job_len(J1,L1),
start(J2,S2),
S1 <= S2, S2 < S1 + L1.
%----------------------
% - order
%----------------------
:- precedes(J1,J2),
start(J1,S1), job_len(J1,L1),
start(J2,S2),
S2 < S1 + L1.
%-------------------------------------
% - completion -- total-tardiness
%-------------------------------------
td(J,S + L - D) :-
job(J),
start(J,S), job_len(J,L),
deadline(J,D),
S + L > D.
td(J,0) :-
job(J),
start(J,S), job_len(J,L),
deadline(J,D),
S + L <= D.
%-------------------------------------
% - completion -- penalty
%-------------------------------------
penalty(J,TD * I) :-
job(J),
td(J,TD),
importance(J,I).
:- penalty(J,P),
max_value(MV),
P > MV.
tot_penalty(TP) :-
pen_value(TP),
TP = #sum{ P,J : penalty(J,P) }.
%
% If the value of the total penalty would be greater than the
% maximum allowed value of pen_value(_), the above rule
% does not define tot_penalty(_).
% In that case, the solution is not acceptable.
%
has_tot_penalty :-
tot_penalty(TP).
-has_tot_penalty :-
not has_tot_penalty.
:- -has_tot_penalty.
:- pen_value(TP), tot_penalty(TP), max_total_penalty(K),
TP > K.
%----------------------
% - instance assignment
%----------------------
:- on_instance(J1,I), on_instance(J2,I),
job_device(J1,D), job_device(J2,D),
instances(D,N), N > 1,
J1 != J2,
start(J1,S1), start(J2,S2),
job_len(J1,L1),
S1 <= S2, S2 < S1 + L1.
:- on_instance(J,I),
device(D),
job(J), job_device(J,D),
offline_instance(D,I),
must_schedule(J).
%----------------------
% - current schedule
%----------------------
already_started(J) :-
curr_job_start(J,S),
curr_time(CT),
CT > S.
already_finished(J) :-
curr_job_start(J,S),
job_len(J,L),
curr_time(CT),
CT >= S + L.
must_schedule(J) :-
job(J),
not must_not_schedule(J).
must_not_schedule(J) :-
already_started(J),
not rescheduled(J).
rescheduled(J) :-
already_started(J),
not already_finished(J),
job_device(J,D),
curr_on_instance(J,I),
offline_instance(D,I).
:- start(J,S),
curr_time(CT),
S < CT,
device(D),
job_device(J,D),
time(S),
must_schedule(J).
:- start(J,S),
curr_job_start(J,CS),
S != CS,
job_device(J,D),
must_not_schedule(J).
:- on_instance(J,I),
curr_on_instance(J,CI),
I != CI,
must_not_schedule(J).
"""
output = """
%
% ******GRINGO 3.x REQUIRED******
%
%
time(0).
time(T+1) :- time(T), T < MT, max_value(MT).
%time(0..MT) :- max_value(MT).
pen_value(T) :- time(T).
td_value(T) :- time(T).
instance_of(D,1) :- device(D).
instance_of(D,I+1) :- device(D), instance_of(D,I), instances(D,N), I < N.
% Pick a unique start time and instance for each job
1 <= { start(J,S) : time(S) } <= 1 :- job(J).
1 <= { on_instance(J,I) : instance_of(D,I) } <= 1 :- job(J), job_device(J,D).
%----------------------
% - overlap
%----------------------
:- on_instance(J1,I), on_instance(J2,I), J1 != J2,
job_device(J1,D), job_device(J2,D),
start(J1,S1), job_len(J1,L1),
start(J2,S2),
S1 <= S2, S2 < S1 + L1.
%----------------------
% - order
%----------------------
:- precedes(J1,J2),
start(J1,S1), job_len(J1,L1),
start(J2,S2),
S2 < S1 + L1.
%-------------------------------------
% - completion -- total-tardiness
%-------------------------------------
td(J,S + L - D) :-
job(J),
start(J,S), job_len(J,L),
deadline(J,D),
S + L > D.
td(J,0) :-
job(J),
start(J,S), job_len(J,L),
deadline(J,D),
S + L <= D.
%-------------------------------------
% - completion -- penalty
%-------------------------------------
penalty(J,TD * I) :-
job(J),
td(J,TD),
importance(J,I).
:- penalty(J,P),
max_value(MV),
P > MV.
tot_penalty(TP) :-
pen_value(TP),
TP = #sum{ P,J : penalty(J,P) }.
%
% If the value of the total penalty would be greater than the
% maximum allowed value of pen_value(_), the above rule
% does not define tot_penalty(_).
% In that case, the solution is not acceptable.
%
has_tot_penalty :-
tot_penalty(TP).
-has_tot_penalty :-
not has_tot_penalty.
:- -has_tot_penalty.
:- pen_value(TP), tot_penalty(TP), max_total_penalty(K),
TP > K.
%----------------------
% - instance assignment
%----------------------
:- on_instance(J1,I), on_instance(J2,I),
job_device(J1,D), job_device(J2,D),
instances(D,N), N > 1,
J1 != J2,
start(J1,S1), start(J2,S2),
job_len(J1,L1),
S1 <= S2, S2 < S1 + L1.
:- on_instance(J,I),
device(D),
job(J), job_device(J,D),
offline_instance(D,I),
must_schedule(J).
%----------------------
% - current schedule
%----------------------
already_started(J) :-
curr_job_start(J,S),
curr_time(CT),
CT > S.
already_finished(J) :-
curr_job_start(J,S),
job_len(J,L),
curr_time(CT),
CT >= S + L.
must_schedule(J) :-
job(J),
not must_not_schedule(J).
must_not_schedule(J) :-
already_started(J),
not rescheduled(J).
rescheduled(J) :-
already_started(J),
not already_finished(J),
job_device(J,D),
curr_on_instance(J,I),
offline_instance(D,I).
:- start(J,S),
curr_time(CT),
S < CT,
device(D),
job_device(J,D),
time(S),
must_schedule(J).
:- start(J,S),
curr_job_start(J,CS),
S != CS,
job_device(J,D),
must_not_schedule(J).
:- on_instance(J,I),
curr_on_instance(J,CI),
I != CI,
must_not_schedule(J).
"""
| 18.834921 | 77 | 0.531434 | 930 | 5,933 | 3.216129 | 0.094624 | 0.060181 | 0.032765 | 0.036777 | 0.996322 | 0.996322 | 0.996322 | 0.996322 | 0.996322 | 0.996322 | 0 | 0.023204 | 0.17192 | 5,933 | 314 | 78 | 18.894904 | 0.585589 | 0 | 0 | 0.952 | 0 | 0.024 | 0.994775 | 0.178662 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.008 | 0 | 0.008 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
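Test files in this layout pair an `input` program with the `output` the parser is expected to reproduce; since the expected output here is the program unchanged, a harness can compare the two strings directly (a hedged sketch — the real DLV2 test runner is not part of this file):

```python
# Minimal illustration of an identity parser test: the expected
# output equals the input program text verbatim.
input_program = "time(0).\ntime(T+1) :- time(T), T < MT, max_value(MT).\n"
expected_output = "time(0).\ntime(T+1) :- time(T), T < MT, max_value(MT).\n"

assert input_program == expected_output
print("round-trip OK")  # → round-trip OK
```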
c7c40a8980b0463e80cdc4ebc4a48759659fa0a3 | 115,654 | py | Python | proyecto.py | pydae/pscig_doc | 34964d1754acdae0e0a8f32da396ad1ce73e1280 | [
"MIT"
] | null | null | null | proyecto.py | pydae/pscig_doc | 34964d1754acdae0e0a8f32da396ad1ce73e1280 | [
"MIT"
] | null | null | null | proyecto.py | pydae/pscig_doc | 34964d1754acdae0e0a8f32da396ad1ce73e1280 | [
"MIT"
] | null | null | null | import numpy as np
import numba
import scipy.optimize as sopt
import json
sin = np.sin
cos = np.cos
atan2 = np.arctan2
sqrt = np.sqrt
class proyecto_class:
def __init__(self):
self.t_end = 10.000000
self.Dt = 0.0010000
self.decimation = 10.000000
self.itol = 1e-6
self.Dt_max = 0.001000
self.Dt_min = 0.001000
self.solvern = 5
self.imax = 100
self.N_x = 7
self.N_y = 20
self.N_z = 7
self.N_store = 10000
self.params_list = ['S_base', 'g_GRI_POI', 'b_GRI_POI', 'g_POI_PMV', 'b_POI_PMV', 'g_PMV_GR1', 'b_PMV_GR1', 'g_GR1_GR2', 'b_GR1_GR2', 'g_PMV_GR3', 'b_PMV_GR3', 'g_GR3_GR4', 'b_GR3_GR4', 'U_GRI_n', 'U_POI_n', 'U_PMV_n', 'U_GR1_n', 'U_GR2_n', 'U_GR3_n', 'U_GR4_n', 'S_n_GRI', 'X_d_GRI', 'X1d_GRI', 'T1d0_GRI', 'X_q_GRI', 'X1q_GRI', 'T1q0_GRI', 'R_a_GRI', 'X_l_GRI', 'H_GRI', 'D_GRI', 'Omega_b_GRI', 'omega_s_GRI', 'K_a_GRI', 'T_r_GRI', 'v_pss_GRI', 'Droop_GRI', 'T_m_GRI', 'K_sec_GRI', 'K_delta_GRI', 'v_ref_GRI']
self.params_values_list = [100000000.0, 1.4986238532110094, -4.995412844036698, 2.941176470588235, -11.76470588235294, 24.742268041237114, -10.996563573883162, 24.742268041237114, -10.996563573883162, 24.742268041237114, -10.996563573883162, 24.742268041237114, -10.996563573883162, 66000.0, 66000.0, 20000.0, 20000.0, 20000.0, 20000.0, 20000.0, 100000000.0, 1.81, 0.3, 8.0, 1.76, 0.65, 1.0, 0.003, 0.05, 6.0, 1.0, 314.1592653589793, 1.0, 100, 0.1, 0.0, 0.05, 5.0, 0.001, 0.01, 1.0]
self.inputs_ini_list = ['P_GRI', 'Q_GRI', 'P_POI', 'Q_POI', 'P_PMV', 'Q_PMV', 'P_GR1', 'Q_GR1', 'P_GR2', 'Q_GR2', 'P_GR3', 'Q_GR3', 'P_GR4', 'Q_GR4']
self.inputs_ini_values_list = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0]
self.inputs_run_list = ['P_GRI', 'Q_GRI', 'P_POI', 'Q_POI', 'P_PMV', 'Q_PMV', 'P_GR1', 'Q_GR1', 'P_GR2', 'Q_GR2', 'P_GR3', 'Q_GR3', 'P_GR4', 'Q_GR4']
self.inputs_run_values_list = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0, 1000000.0, 0.0]
self.outputs_list = ['V_GRI', 'V_POI', 'V_PMV', 'V_GR1', 'V_GR2', 'V_GR3', 'V_GR4']
self.x_list = ['delta_GRI', 'omega_GRI', 'e1q_GRI', 'e1d_GRI', 'v_c_GRI', 'p_m_GRI', 'xi_m_GRI']
self.y_run_list = ['V_GRI', 'theta_GRI', 'V_POI', 'theta_POI', 'V_PMV', 'theta_PMV', 'V_GR1', 'theta_GR1', 'V_GR2', 'theta_GR2', 'V_GR3', 'theta_GR3', 'V_GR4', 'theta_GR4', 'i_d_GRI', 'i_q_GRI', 'P_GRI_1', 'Q_GRI_1', 'v_f_GRI', 'p_m_ref_GRI']
self.xy_list = self.x_list + self.y_run_list
self.y_ini_list = ['V_GRI', 'theta_GRI', 'V_POI', 'theta_POI', 'V_PMV', 'theta_PMV', 'V_GR1', 'theta_GR1', 'V_GR2', 'theta_GR2', 'V_GR3', 'theta_GR3', 'V_GR4', 'theta_GR4', 'i_d_GRI', 'i_q_GRI', 'P_GRI_1', 'Q_GRI_1', 'v_f_GRI', 'p_m_ref_GRI']
self.xy_ini_list = self.x_list + self.y_ini_list
self.t = 0.0
self.it = 0
self.it_store = 0
self.xy_prev = np.zeros((self.N_x+self.N_y,1))
self.initialization_tol = 1e-6
self.N_u = len(self.inputs_run_list)
self.sopt_root_method='hybr'
self.sopt_root_jac=True
self.u_ini_list = self.inputs_ini_list
self.u_ini_values_list = self.inputs_ini_values_list
self.u_run_list = self.inputs_run_list
self.u_run_values_list = self.inputs_run_values_list
self.N_u = len(self.u_run_list)
self.update()
def update(self):
self.N_steps = int(np.ceil(self.t_end/self.Dt))
dt = [
('t_end', np.float64),
('Dt', np.float64),
('decimation', np.float64),
('itol', np.float64),
('Dt_max', np.float64),
('Dt_min', np.float64),
('solvern', np.int64),
('imax', np.int64),
('N_steps', np.int64),
('N_store', np.int64),
('N_x', np.int64),
('N_y', np.int64),
('N_z', np.int64),
('t', np.float64),
('it', np.int64),
('it_store', np.int64),
('idx', np.int64),
('idy', np.int64),
('f', np.float64, (self.N_x,1)),
('x', np.float64, (self.N_x,1)),
('x_0', np.float64, (self.N_x,1)),
('g', np.float64, (self.N_y,1)),
('y_run', np.float64, (self.N_y,1)),
('y_ini', np.float64, (self.N_y,1)),
('u_run', np.float64, (self.N_u,1)),
('y_0', np.float64, (self.N_y,1)),
('h', np.float64, (self.N_z,1)),
('Fx', np.float64, (self.N_x,self.N_x)),
('Fy', np.float64, (self.N_x,self.N_y)),
('Gx', np.float64, (self.N_y,self.N_x)),
('Gy', np.float64, (self.N_y,self.N_y)),
('Fu', np.float64, (self.N_x,self.N_u)),
('Gu', np.float64, (self.N_y,self.N_u)),
('Hx', np.float64, (self.N_z,self.N_x)),
('Hy', np.float64, (self.N_z,self.N_y)),
('Hu', np.float64, (self.N_z,self.N_u)),
('Fx_ini', np.float64, (self.N_x,self.N_x)),
('Fy_ini', np.float64, (self.N_x,self.N_y)),
('Gx_ini', np.float64, (self.N_y,self.N_x)),
('Gy_ini', np.float64, (self.N_y,self.N_y)),
('T', np.float64, (self.N_store+1,1)),
('X', np.float64, (self.N_store+1,self.N_x)),
('Y', np.float64, (self.N_store+1,self.N_y)),
('Z', np.float64, (self.N_store+1,self.N_z)),
('iters', np.float64, (self.N_store+1,1)),
('store', np.int64),
]
values = [
self.t_end,
self.Dt,
self.decimation,
self.itol,
self.Dt_max,
self.Dt_min,
self.solvern,
self.imax,
self.N_steps,
self.N_store,
self.N_x,
self.N_y,
self.N_z,
self.t,
self.it,
self.it_store,
0, # idx
0, # idy
np.zeros((self.N_x,1)), # f
np.zeros((self.N_x,1)), # x
np.zeros((self.N_x,1)), # x_0
np.zeros((self.N_y,1)), # g
np.zeros((self.N_y,1)), # y_run
np.zeros((self.N_y,1)), # y_ini
np.zeros((self.N_u,1)), # u_run
np.zeros((self.N_y,1)), # y_0
np.zeros((self.N_z,1)), # h
np.zeros((self.N_x,self.N_x)), # Fx
np.zeros((self.N_x,self.N_y)), # Fy
np.zeros((self.N_y,self.N_x)), # Gx
np.zeros((self.N_y,self.N_y)), # Fy
np.zeros((self.N_x,self.N_u)), # Fu
np.zeros((self.N_y,self.N_u)), # Gu
np.zeros((self.N_z,self.N_x)), # Hx
np.zeros((self.N_z,self.N_y)), # Hy
np.zeros((self.N_z,self.N_u)), # Hu
np.zeros((self.N_x,self.N_x)), # Fx_ini
np.zeros((self.N_x,self.N_y)), # Fy_ini
np.zeros((self.N_y,self.N_x)), # Gx_ini
np.zeros((self.N_y,self.N_y)), # Fy_ini
np.zeros((self.N_store+1,1)), # T
np.zeros((self.N_store+1,self.N_x)), # X
np.zeros((self.N_store+1,self.N_y)), # Y
np.zeros((self.N_store+1,self.N_z)), # Z
np.zeros((self.N_store+1,1)), # iters
1,
]
dt += [(item,np.float64) for item in self.params_list]
values += [item for item in self.params_values_list]
for item_id,item_val in zip(self.inputs_ini_list,self.inputs_ini_values_list):
if item_id in self.inputs_run_list: continue
dt += [(item_id,np.float64)]
values += [item_val]
dt += [(item,np.float64) for item in self.inputs_run_list]
values += [item for item in self.inputs_run_values_list]
self.struct = np.rec.array([tuple(values)], dtype=np.dtype(dt))
xy0 = np.zeros((self.N_x+self.N_y,))
self.ini_dae_jacobian_nn(xy0)
self.run_dae_jacobian_nn(xy0)
def load_params(self,data_input):
if type(data_input) == str:
json_file = data_input
self.json_file = json_file
self.json_data = open(json_file).read().replace("'",'"')
data = json.loads(self.json_data)
elif type(data_input) == dict:
data = data_input
self.data = data
for item in self.data:
self.struct[0][item] = self.data[item]
self.params_values_list[self.params_list.index(item)] = self.data[item]
def ini_problem(self,x):
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_ini[:,0] = x[self.N_x:(self.N_x+self.N_y)]
ini(self.struct,2)
ini(self.struct,3)
fg = np.vstack((self.struct[0].f,self.struct[0].g))[:,0]
return fg
def run_problem(self,x):
t = self.struct[0].t
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_run[:,0] = x[self.N_x:(self.N_x+self.N_y)]
run(t,self.struct,2)
run(t,self.struct,3)
run(t,self.struct,10)
run(t,self.struct,11)
run(t,self.struct,12)
run(t,self.struct,13)
fg = np.vstack((self.struct[0].f,self.struct[0].g))[:,0]
return fg
def run_dae_jacobian(self,x):
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_run[:,0] = x[self.N_x:(self.N_x+self.N_y)]
run(0.0,self.struct,10)
run(0.0,self.struct,11)
run(0.0,self.struct,12)
run(0.0,self.struct,13)
A_c = np.block([[self.struct[0].Fx,self.struct[0].Fy],
[self.struct[0].Gx,self.struct[0].Gy]])
return A_c
def run_dae_jacobian_nn(self,x):
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_run[:,0] = x[self.N_x:(self.N_x+self.N_y)]
run_nn(0.0,self.struct,10)
run_nn(0.0,self.struct,11)
run_nn(0.0,self.struct,12)
run_nn(0.0,self.struct,13)
def eval_jacobians(self):
run(0.0,self.struct,10)
run(0.0,self.struct,11)
run(0.0,self.struct,12)
return 1
def ini_dae_jacobian(self,x):
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_ini[:,0] = x[self.N_x:(self.N_x+self.N_y)]
ini(self.struct,10)
ini(self.struct,11)
A_c = np.block([[self.struct[0].Fx_ini,self.struct[0].Fy_ini],
[self.struct[0].Gx_ini,self.struct[0].Gy_ini]])
return A_c
def ini_dae_jacobian_nn(self,x):
self.struct[0].x[:,0] = x[0:self.N_x]
self.struct[0].y_ini[:,0] = x[self.N_x:(self.N_x+self.N_y)]
ini_nn(self.struct,10)
ini_nn(self.struct,11)
def f_ode(self,x):
self.struct[0].x[:,0] = x
run(self.struct,1)
return self.struct[0].f[:,0]
def f_odeint(self,x,t):
self.struct[0].x[:,0] = x
run(self.struct,1)
return self.struct[0].f[:,0]
def f_ivp(self,t,x):
self.struct[0].x[:,0] = x
run(self.struct,1)
return self.struct[0].f[:,0]
def Fx_ode(self,x):
self.struct[0].x[:,0] = x
run(self.struct,10)
return self.struct[0].Fx
def eval_A(self):
Fx = self.struct[0].Fx
Fy = self.struct[0].Fy
Gx = self.struct[0].Gx
Gy = self.struct[0].Gy
A = Fx - Fy @ np.linalg.solve(Gy,Gx)
self.A = A
return A
def eval_A_ini(self):
Fx = self.struct[0].Fx_ini
Fy = self.struct[0].Fy_ini
Gx = self.struct[0].Gx_ini
Gy = self.struct[0].Gy_ini
A = Fx - Fy @ np.linalg.solve(Gy,Gx)
return A
def reset(self):
for param,param_value in zip(self.params_list,self.params_values_list):
self.struct[0][param] = param_value
for input_name,input_value in zip(self.inputs_ini_list,self.inputs_ini_values_list):
self.struct[0][input_name] = input_value
for input_name,input_value in zip(self.inputs_run_list,self.inputs_run_values_list):
self.struct[0][input_name] = input_value
def simulate(self,events,xy0=0):
# initialize both the ini and the run system
self.initialize(events,xy0=xy0)
# simulation run
for event in events:
# make all the desired changes
self.run([event])
# post process
T,X,Y,Z = self.post()
return T,X,Y,Z
def run(self,events):
# simulation run
for event in events:
# make all the desired changes
for item in event:
self.struct[0][item] = event[item]
daesolver(self.struct) # run until next event
return 1
def rtrun(self,events):
# simulation run
for event in events:
# make all the desired changes
for item in event:
self.struct[0][item] = event[item]
self.struct[0].it_store = self.struct[0].N_store-1
daesolver(self.struct) # run until next event
return 1
def post(self):
# post process result
T = self.struct[0]['T'][:self.struct[0].it_store]
X = self.struct[0]['X'][:self.struct[0].it_store,:]
Y = self.struct[0]['Y'][:self.struct[0].it_store,:]
Z = self.struct[0]['Z'][:self.struct[0].it_store,:]
iters = self.struct[0]['iters'][:self.struct[0].it_store,:]
self.T = T
self.X = X
self.Y = Y
self.Z = Z
self.iters = iters
return T,X,Y,Z
def save_0(self,file_name = 'xy_0.json'):
xy_0_dict = {}
for item in self.x_list:
xy_0_dict.update({item:self.get_value(item)})
for item in self.y_ini_list:
xy_0_dict.update({item:self.get_value(item)})
xy_0_str = json.dumps(xy_0_dict, indent=4)
with open(file_name,'w') as fobj:
fobj.write(xy_0_str)
def load_0(self,file_name = 'xy_0.json'):
with open(file_name) as fobj:
xy_0_str = fobj.read()
xy_0_dict = json.loads(xy_0_str)
for item in xy_0_dict:
if item in self.x_list:
self.xy_prev[self.x_list.index(item)] = xy_0_dict[item]
if item in self.y_ini_list:
self.xy_prev[self.y_ini_list.index(item)+self.N_x] = xy_0_dict[item]
def initialize(self,events=[{}],xy0=0):
'''
Parameters
----------
events : dictionary
Dictionary with at least 't_end' and all inputs and parameters
that need to be changed.
xy0 : float or string, optional
0 means all states should be zero as initial guess.
If not zero all the states initial guess are the given input.
If 'prev' it uses the last known initialization result as initial guess.
Returns
-------
initialization_ok : bool
True if the initialization problem converged; False otherwise.
'''
# simulation parameters
self.struct[0].it = 0 # set time step to zero
self.struct[0].it_store = 0 # set storage to zero
self.struct[0].t = 0.0 # set time to zero
# initialization
it_event = 0
event = events[it_event]
for item in event:
self.struct[0][item] = event[item]
## compute initial conditions using x and y_ini
if type(xy0) == str:
if xy0 == 'prev':
xy0 = self.xy_prev
else:
self.load_0(xy0)
xy0 = self.xy_prev
elif type(xy0) == dict:
with open('xy_0.json','w') as fobj:
fobj.write(json.dumps(xy0))
self.load_0('xy_0.json')
xy0 = self.xy_prev
else:
if xy0 == 0:
xy0 = np.zeros(self.N_x+self.N_y)
elif xy0 == 1:
xy0 = np.ones(self.N_x+self.N_y)
else:
xy0 = xy0*np.ones(self.N_x+self.N_y)
#xy = sopt.fsolve(self.ini_problem,xy0, jac=self.ini_dae_jacobian )
if self.sopt_root_jac:
sol = sopt.root(self.ini_problem, xy0,
jac=self.ini_dae_jacobian,
method=self.sopt_root_method, tol=self.initialization_tol)
else:
sol = sopt.root(self.ini_problem, xy0, method=self.sopt_root_method)
self.initialization_ok = True
if sol.success == False:
print('initialization not found!')
self.initialization_ok = False
T = self.struct[0]['T'][:self.struct[0].it_store]
X = self.struct[0]['X'][:self.struct[0].it_store,:]
Y = self.struct[0]['Y'][:self.struct[0].it_store,:]
Z = self.struct[0]['Z'][:self.struct[0].it_store,:]
iters = self.struct[0]['iters'][:self.struct[0].it_store,:]
if self.initialization_ok:
xy = sol.x
self.xy_prev = xy
self.struct[0].x[:,0] = xy[0:self.N_x]
self.struct[0].y_run[:,0] = xy[self.N_x:]
## y_ini to u_run
for item in self.inputs_run_list:
if item in self.y_ini_list:
self.struct[0][item] = self.struct[0].y_ini[self.y_ini_list.index(item)]
## u_ini to y_run
for item in self.inputs_ini_list:
if item in self.y_run_list:
self.struct[0].y_run[self.y_run_list.index(item)] = self.struct[0][item]
#xy = sopt.fsolve(self.ini_problem,xy0, jac=self.ini_dae_jacobian )
if self.sopt_root_jac:
sol = sopt.root(self.run_problem, xy0,
jac=self.run_dae_jacobian,
method=self.sopt_root_method, tol=self.initialization_tol)
else:
sol = sopt.root(self.run_problem, xy0, method=self.sopt_root_method)
# evaluate f and g
run(0.0,self.struct,2)
run(0.0,self.struct,3)
# evaluate run jacobians
run(0.0,self.struct,10)
run(0.0,self.struct,11)
run(0.0,self.struct,12)
run(0.0,self.struct,14)
# post process result
T = self.struct[0]['T'][:self.struct[0].it_store]
X = self.struct[0]['X'][:self.struct[0].it_store,:]
Y = self.struct[0]['Y'][:self.struct[0].it_store,:]
Z = self.struct[0]['Z'][:self.struct[0].it_store,:]
iters = self.struct[0]['iters'][:self.struct[0].it_store,:]
self.T = T
self.X = X
self.Y = Y
self.Z = Z
self.iters = iters
return self.initialization_ok
def get_value(self,name):
if name in self.inputs_run_list:
value = self.struct[0][name]
if name in self.x_list:
idx = self.x_list.index(name)
value = self.struct[0].x[idx,0]
if name in self.y_run_list:
idy = self.y_run_list.index(name)
value = self.struct[0].y_run[idy,0]
if name in self.params_list:
value = self.struct[0][name]
if name in self.outputs_list:
value = self.struct[0].h[self.outputs_list.index(name),0]
return value
def get_values(self,name):
if name in self.x_list:
values = self.X[:,self.x_list.index(name)]
if name in self.y_run_list:
values = self.Y[:,self.y_run_list.index(name)]
if name in self.outputs_list:
values = self.Z[:,self.outputs_list.index(name)]
return values
def get_mvalue(self,names):
'''
Parameters
----------
names : list
list of variables names to return each value.
Returns
-------
mvalue : list
list with the value of each requested variable.
'''
mvalue = []
for name in names:
mvalue += [self.get_value(name)]
return mvalue
def set_value(self,name,value):
if name in self.inputs_run_list:
self.struct[0][name] = value
if name in self.params_list:
self.struct[0][name] = value
def report_x(self,value_format='5.2f'):
for item in self.x_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_y(self,value_format='5.2f'):
for item in self.y_run_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_u(self,value_format='5.2f'):
for item in self.inputs_run_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_z(self,value_format='5.2f'):
for item in self.outputs_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def report_params(self,value_format='5.2f'):
for item in self.params_list:
print(f'{item:5s} = {self.get_value(item):{value_format}}')
def get_x(self):
return self.struct[0].x
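The `update()` method above packs every solver field into a single NumPy record array so the numba-jitted `ini`/`run` kernels can take one typed argument; a minimal sketch of that pattern (field names chosen for illustration):

```python
import numpy as np

# Structured dtype: scalar fields plus fixed-shape array fields,
# mirroring how proyecto_class.update() builds its dtype list.
dt = [("t", np.float64), ("N_x", np.int64), ("x", np.float64, (3, 1))]
values = [0.0, 3, np.zeros((3, 1))]

struct = np.rec.array([tuple(values)], dtype=np.dtype(dt))

struct[0].x[:, 0] = [1.0, 2.0, 3.0]  # array fields are mutable views
print(struct[0].N_x, struct[0].x.sum())  # → 3 6.0
```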
@numba.njit(cache=True)
def ini(struct,mode):
# Parameters:
S_base = struct[0].S_base
g_GRI_POI = struct[0].g_GRI_POI
b_GRI_POI = struct[0].b_GRI_POI
g_POI_PMV = struct[0].g_POI_PMV
b_POI_PMV = struct[0].b_POI_PMV
g_PMV_GR1 = struct[0].g_PMV_GR1
b_PMV_GR1 = struct[0].b_PMV_GR1
g_GR1_GR2 = struct[0].g_GR1_GR2
b_GR1_GR2 = struct[0].b_GR1_GR2
g_PMV_GR3 = struct[0].g_PMV_GR3
b_PMV_GR3 = struct[0].b_PMV_GR3
g_GR3_GR4 = struct[0].g_GR3_GR4
b_GR3_GR4 = struct[0].b_GR3_GR4
U_GRI_n = struct[0].U_GRI_n
U_POI_n = struct[0].U_POI_n
U_PMV_n = struct[0].U_PMV_n
U_GR1_n = struct[0].U_GR1_n
U_GR2_n = struct[0].U_GR2_n
U_GR3_n = struct[0].U_GR3_n
U_GR4_n = struct[0].U_GR4_n
S_n_GRI = struct[0].S_n_GRI
X_d_GRI = struct[0].X_d_GRI
X1d_GRI = struct[0].X1d_GRI
T1d0_GRI = struct[0].T1d0_GRI
X_q_GRI = struct[0].X_q_GRI
X1q_GRI = struct[0].X1q_GRI
T1q0_GRI = struct[0].T1q0_GRI
R_a_GRI = struct[0].R_a_GRI
X_l_GRI = struct[0].X_l_GRI
H_GRI = struct[0].H_GRI
D_GRI = struct[0].D_GRI
Omega_b_GRI = struct[0].Omega_b_GRI
omega_s_GRI = struct[0].omega_s_GRI
K_a_GRI = struct[0].K_a_GRI
T_r_GRI = struct[0].T_r_GRI
v_pss_GRI = struct[0].v_pss_GRI
Droop_GRI = struct[0].Droop_GRI
T_m_GRI = struct[0].T_m_GRI
K_sec_GRI = struct[0].K_sec_GRI
K_delta_GRI = struct[0].K_delta_GRI
v_ref_GRI = struct[0].v_ref_GRI
# Inputs:
P_GRI = struct[0].P_GRI
Q_GRI = struct[0].Q_GRI
P_POI = struct[0].P_POI
Q_POI = struct[0].Q_POI
P_PMV = struct[0].P_PMV
Q_PMV = struct[0].Q_PMV
P_GR1 = struct[0].P_GR1
Q_GR1 = struct[0].Q_GR1
P_GR2 = struct[0].P_GR2
Q_GR2 = struct[0].Q_GR2
P_GR3 = struct[0].P_GR3
Q_GR3 = struct[0].Q_GR3
P_GR4 = struct[0].P_GR4
Q_GR4 = struct[0].Q_GR4
# Dynamical states:
delta_GRI = struct[0].x[0,0]
omega_GRI = struct[0].x[1,0]
e1q_GRI = struct[0].x[2,0]
e1d_GRI = struct[0].x[3,0]
v_c_GRI = struct[0].x[4,0]
p_m_GRI = struct[0].x[5,0]
xi_m_GRI = struct[0].x[6,0]
# Algebraic states:
V_GRI = struct[0].y_ini[0,0]
theta_GRI = struct[0].y_ini[1,0]
V_POI = struct[0].y_ini[2,0]
theta_POI = struct[0].y_ini[3,0]
V_PMV = struct[0].y_ini[4,0]
theta_PMV = struct[0].y_ini[5,0]
V_GR1 = struct[0].y_ini[6,0]
theta_GR1 = struct[0].y_ini[7,0]
V_GR2 = struct[0].y_ini[8,0]
theta_GR2 = struct[0].y_ini[9,0]
V_GR3 = struct[0].y_ini[10,0]
theta_GR3 = struct[0].y_ini[11,0]
V_GR4 = struct[0].y_ini[12,0]
theta_GR4 = struct[0].y_ini[13,0]
i_d_GRI = struct[0].y_ini[14,0]
i_q_GRI = struct[0].y_ini[15,0]
P_GRI_1 = struct[0].y_ini[16,0]
Q_GRI_1 = struct[0].y_ini[17,0]
v_f_GRI = struct[0].y_ini[18,0]
p_m_ref_GRI = struct[0].y_ini[19,0]
# Differential equations:
if mode == 2:
struct[0].f[0,0] = -K_delta_GRI*delta_GRI + Omega_b_GRI*(omega_GRI - omega_s_GRI)
struct[0].f[1,0] = (-D_GRI*(omega_GRI - omega_s_GRI) - i_d_GRI*(R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI)) - i_q_GRI*(R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI)) + p_m_GRI)/(2*H_GRI)
struct[0].f[2,0] = (-e1q_GRI - i_d_GRI*(-X1d_GRI + X_d_GRI) + v_f_GRI)/T1d0_GRI
struct[0].f[3,0] = (-e1d_GRI + i_q_GRI*(-X1q_GRI + X_q_GRI))/T1q0_GRI
struct[0].f[4,0] = (V_GRI - v_c_GRI)/T_r_GRI
struct[0].f[5,0] = (-p_m_GRI + p_m_ref_GRI)/T_m_GRI
struct[0].f[6,0] = omega_GRI - 1
# Algebraic equations:
if mode == 3:
g_n = np.ascontiguousarray(struct[0].Gy_ini) @ np.ascontiguousarray(struct[0].y_ini)
struct[0].g[0,0] = -P_GRI/S_base - P_GRI_1/S_base + V_GRI**2*g_GRI_POI + V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].g[1,0] = -Q_GRI/S_base - Q_GRI_1/S_base - V_GRI**2*b_GRI_POI + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].g[2,0] = -P_POI/S_base + V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + V_POI**2*(g_GRI_POI + g_POI_PMV)
struct[0].g[3,0] = -Q_POI/S_base + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + V_POI**2*(-b_GRI_POI - b_POI_PMV)
struct[0].g[4,0] = -P_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV**2*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].g[5,0] = -Q_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV**2*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].g[6,0] = -P_GR1/S_base + V_GR1**2*(g_GR1_GR2 + g_PMV_GR1) + V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].g[7,0] = -Q_GR1/S_base + V_GR1**2*(-b_GR1_GR2 - b_PMV_GR1) + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].g[8,0] = -P_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR2**2*g_GR1_GR2
struct[0].g[9,0] = -Q_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - V_GR2**2*b_GR1_GR2
struct[0].g[10,0] = -P_GR3/S_base + V_GR3**2*(g_GR3_GR4 + g_PMV_GR3) + V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].g[11,0] = -Q_GR3/S_base + V_GR3**2*(-b_GR3_GR4 - b_PMV_GR3) + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].g[12,0] = -P_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR4**2*g_GR3_GR4
struct[0].g[13,0] = -Q_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - V_GR4**2*b_GR3_GR4
struct[0].g[14,0] = R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI) + X1d_GRI*i_d_GRI - e1q_GRI
struct[0].g[15,0] = R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI) - X1q_GRI*i_q_GRI - e1d_GRI
struct[0].g[16,0] = -P_GRI_1/S_n_GRI + V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].g[17,0] = -Q_GRI_1/S_n_GRI + V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].g[18,0] = K_a_GRI*(-v_c_GRI + v_pss_GRI + v_ref_GRI) - v_f_GRI
struct[0].g[19,0] = -K_sec_GRI*xi_m_GRI - p_m_ref_GRI - (omega_GRI - 1)/Droop_GRI
# Outputs:
if mode == 3:
struct[0].h[0,0] = V_GRI
struct[0].h[1,0] = V_POI
struct[0].h[2,0] = V_PMV
struct[0].h[3,0] = V_GR1
struct[0].h[4,0] = V_GR2
struct[0].h[5,0] = V_GR3
struct[0].h[6,0] = V_GR4
if mode == 10:
struct[0].Fx_ini[0,0] = -K_delta_GRI
struct[0].Fx_ini[0,1] = Omega_b_GRI
struct[0].Fx_ini[1,0] = (-V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fx_ini[1,1] = -D_GRI/(2*H_GRI)
struct[0].Fx_ini[1,5] = 1/(2*H_GRI)
struct[0].Fx_ini[2,2] = -1/T1d0_GRI
struct[0].Fx_ini[3,3] = -1/T1q0_GRI
struct[0].Fx_ini[4,4] = -1/T_r_GRI
struct[0].Fx_ini[5,5] = -1/T_m_GRI
if mode == 11:
struct[0].Fy_ini[1,0] = (-i_d_GRI*sin(delta_GRI - theta_GRI) - i_q_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,1] = (V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,14] = (-2*R_a_GRI*i_d_GRI - V_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,15] = (-2*R_a_GRI*i_q_GRI - V_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[2,14] = (X1d_GRI - X_d_GRI)/T1d0_GRI
struct[0].Fy_ini[2,18] = 1/T1d0_GRI
struct[0].Fy_ini[3,15] = (-X1q_GRI + X_q_GRI)/T1q0_GRI
struct[0].Fy_ini[4,0] = 1/T_r_GRI
struct[0].Fy_ini[5,19] = 1/T_m_GRI
struct[0].Gx_ini[14,0] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gx_ini[14,2] = -1
struct[0].Gx_ini[15,0] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gx_ini[15,3] = -1
struct[0].Gx_ini[16,0] = V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gx_ini[17,0] = -V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gx_ini[18,4] = -K_a_GRI
struct[0].Gx_ini[19,1] = -1/Droop_GRI
struct[0].Gx_ini[19,6] = -K_sec_GRI
struct[0].Gy_ini[0,0] = 2*V_GRI*g_GRI_POI + V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[0,1] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[0,2] = V_GRI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[0,3] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[0,16] = -1/S_base
struct[0].Gy_ini[1,0] = -2*V_GRI*b_GRI_POI + V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[1,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[1,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[1,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[1,17] = -1/S_base
struct[0].Gy_ini[2,0] = V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[2,1] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[2,2] = V_GRI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + 2*V_POI*(g_GRI_POI + g_POI_PMV)
struct[0].Gy_ini[2,3] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[2,4] = V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[2,5] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[3,0] = V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[3,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[3,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + 2*V_POI*(-b_GRI_POI - b_POI_PMV)
struct[0].Gy_ini[3,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[3,4] = V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[3,5] = V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,2] = V_PMV*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,3] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[4,4] = V_GR1*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + 2*V_PMV*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,5] = V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[4,6] = V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[4,7] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[4,10] = V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[4,11] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[5,2] = V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[5,3] = V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[5,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + 2*V_PMV*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[5,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[5,6] = V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[5,7] = V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[5,10] = V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[5,11] = V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[6,4] = V_GR1*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,5] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,6] = 2*V_GR1*(g_GR1_GR2 + g_PMV_GR1) + V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,7] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,8] = V_GR1*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[6,9] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[7,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,6] = 2*V_GR1*(-b_GR1_GR2 - b_PMV_GR1) + V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[7,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,6] = V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,7] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,8] = V_GR1*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + 2*V_GR2*g_GR1_GR2
struct[0].Gy_ini[8,9] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,6] = V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - 2*V_GR2*b_GR1_GR2
struct[0].Gy_ini[9,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[10,4] = V_GR3*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,5] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,10] = 2*V_GR3*(g_GR3_GR4 + g_PMV_GR3) + V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,11] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,12] = V_GR3*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[10,13] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[11,4] = V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,5] = V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,10] = 2*V_GR3*(-b_GR3_GR4 - b_PMV_GR3) + V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[11,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,10] = V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,11] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,12] = V_GR3*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + 2*V_GR4*g_GR3_GR4
struct[0].Gy_ini[12,13] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,10] = V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - 2*V_GR4*b_GR3_GR4
struct[0].Gy_ini[13,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[14,0] = cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[14,1] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[14,14] = X1d_GRI
struct[0].Gy_ini[14,15] = R_a_GRI
struct[0].Gy_ini[15,0] = sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[15,1] = -V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[15,14] = R_a_GRI
struct[0].Gy_ini[15,15] = -X1q_GRI
struct[0].Gy_ini[16,0] = i_d_GRI*sin(delta_GRI - theta_GRI) + i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,1] = -V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,14] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,15] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,16] = -1/S_n_GRI
struct[0].Gy_ini[17,0] = i_d_GRI*cos(delta_GRI - theta_GRI) - i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,1] = V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,14] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,15] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,17] = -1/S_n_GRI
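# The Fx_ini, Fy_ini, Gx_ini and Gy_ini blocks filled above are the four quadrants of the
# Jacobian of the initialization problem f(x, y) = 0, g(x, y) = 0. A minimal sketch of how
# such blocks can drive a Newton iteration is shown below; it assumes small dense NumPy
# arrays, and the toy residuals (f = -x + y, g = y**2 - 4) are hypothetical, not part of
# this grid model.

```python
import numpy as np

def newton_dae_init(f_g, jac, x0, y0, tol=1e-10, max_iter=20):
    """Solve f(x, y) = 0, g(x, y) = 0 with a block-Jacobian Newton method."""
    x, y = x0.copy(), y0.copy()
    nx = len(x)
    for _ in range(max_iter):
        r = f_g(x, y)                       # stacked residual [f; g]
        if np.max(np.abs(r)) < tol:
            break
        Fx, Fy, Gx, Gy = jac(x, y)
        J = np.block([[Fx, Fy], [Gx, Gy]])  # same layout as the *_ini blocks
        dxy = np.linalg.solve(J, -r)        # Newton step for [x; y]
        x += dxy[:nx]
        y += dxy[nx:]
    return x, y

# Hypothetical toy DAE: one state x, one algebraic variable y.
f_g = lambda x, y: np.concatenate([-x + y, y**2 - 4.0])
jac = lambda x, y: (np.array([[-1.0]]), np.array([[1.0]]),
                    np.array([[0.0]]), np.array([[2.0 * y[0]]]))
x_eq, y_eq = newton_dae_init(f_g, jac, np.array([0.5]), np.array([1.0]))
```

# In the generated model the same structure applies, only with the 7 differential and
# 20 algebraic equations of this network instead of the toy residuals.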
@numba.njit(cache=True)
def run(t, struct, mode):
# Parameters:
S_base = struct[0].S_base
g_GRI_POI = struct[0].g_GRI_POI
b_GRI_POI = struct[0].b_GRI_POI
g_POI_PMV = struct[0].g_POI_PMV
b_POI_PMV = struct[0].b_POI_PMV
g_PMV_GR1 = struct[0].g_PMV_GR1
b_PMV_GR1 = struct[0].b_PMV_GR1
g_GR1_GR2 = struct[0].g_GR1_GR2
b_GR1_GR2 = struct[0].b_GR1_GR2
g_PMV_GR3 = struct[0].g_PMV_GR3
b_PMV_GR3 = struct[0].b_PMV_GR3
g_GR3_GR4 = struct[0].g_GR3_GR4
b_GR3_GR4 = struct[0].b_GR3_GR4
U_GRI_n = struct[0].U_GRI_n
U_POI_n = struct[0].U_POI_n
U_PMV_n = struct[0].U_PMV_n
U_GR1_n = struct[0].U_GR1_n
U_GR2_n = struct[0].U_GR2_n
U_GR3_n = struct[0].U_GR3_n
U_GR4_n = struct[0].U_GR4_n
S_n_GRI = struct[0].S_n_GRI
X_d_GRI = struct[0].X_d_GRI
X1d_GRI = struct[0].X1d_GRI
T1d0_GRI = struct[0].T1d0_GRI
X_q_GRI = struct[0].X_q_GRI
X1q_GRI = struct[0].X1q_GRI
T1q0_GRI = struct[0].T1q0_GRI
R_a_GRI = struct[0].R_a_GRI
X_l_GRI = struct[0].X_l_GRI
H_GRI = struct[0].H_GRI
D_GRI = struct[0].D_GRI
Omega_b_GRI = struct[0].Omega_b_GRI
omega_s_GRI = struct[0].omega_s_GRI
K_a_GRI = struct[0].K_a_GRI
T_r_GRI = struct[0].T_r_GRI
v_pss_GRI = struct[0].v_pss_GRI
Droop_GRI = struct[0].Droop_GRI
T_m_GRI = struct[0].T_m_GRI
K_sec_GRI = struct[0].K_sec_GRI
K_delta_GRI = struct[0].K_delta_GRI
v_ref_GRI = struct[0].v_ref_GRI
# Inputs:
P_GRI = struct[0].P_GRI
Q_GRI = struct[0].Q_GRI
P_POI = struct[0].P_POI
Q_POI = struct[0].Q_POI
P_PMV = struct[0].P_PMV
Q_PMV = struct[0].Q_PMV
P_GR1 = struct[0].P_GR1
Q_GR1 = struct[0].Q_GR1
P_GR2 = struct[0].P_GR2
Q_GR2 = struct[0].Q_GR2
P_GR3 = struct[0].P_GR3
Q_GR3 = struct[0].Q_GR3
P_GR4 = struct[0].P_GR4
Q_GR4 = struct[0].Q_GR4
# Dynamical states:
delta_GRI = struct[0].x[0,0]
omega_GRI = struct[0].x[1,0]
e1q_GRI = struct[0].x[2,0]
e1d_GRI = struct[0].x[3,0]
v_c_GRI = struct[0].x[4,0]
p_m_GRI = struct[0].x[5,0]
xi_m_GRI = struct[0].x[6,0]
# Algebraic states:
V_GRI = struct[0].y_run[0,0]
theta_GRI = struct[0].y_run[1,0]
V_POI = struct[0].y_run[2,0]
theta_POI = struct[0].y_run[3,0]
V_PMV = struct[0].y_run[4,0]
theta_PMV = struct[0].y_run[5,0]
V_GR1 = struct[0].y_run[6,0]
theta_GR1 = struct[0].y_run[7,0]
V_GR2 = struct[0].y_run[8,0]
theta_GR2 = struct[0].y_run[9,0]
V_GR3 = struct[0].y_run[10,0]
theta_GR3 = struct[0].y_run[11,0]
V_GR4 = struct[0].y_run[12,0]
theta_GR4 = struct[0].y_run[13,0]
i_d_GRI = struct[0].y_run[14,0]
i_q_GRI = struct[0].y_run[15,0]
P_GRI_1 = struct[0].y_run[16,0]
Q_GRI_1 = struct[0].y_run[17,0]
v_f_GRI = struct[0].y_run[18,0]
p_m_ref_GRI = struct[0].y_run[19,0]
struct[0].u_run[0,0] = P_GRI
struct[0].u_run[1,0] = Q_GRI
struct[0].u_run[2,0] = P_POI
struct[0].u_run[3,0] = Q_POI
struct[0].u_run[4,0] = P_PMV
struct[0].u_run[5,0] = Q_PMV
struct[0].u_run[6,0] = P_GR1
struct[0].u_run[7,0] = Q_GR1
struct[0].u_run[8,0] = P_GR2
struct[0].u_run[9,0] = Q_GR2
struct[0].u_run[10,0] = P_GR3
struct[0].u_run[11,0] = Q_GR3
struct[0].u_run[12,0] = P_GR4
struct[0].u_run[13,0] = Q_GR4
# Differential equations:
if mode == 2:
struct[0].f[0,0] = -K_delta_GRI*delta_GRI + Omega_b_GRI*(omega_GRI - omega_s_GRI)
struct[0].f[1,0] = (-D_GRI*(omega_GRI - omega_s_GRI) - i_d_GRI*(R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI)) - i_q_GRI*(R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI)) + p_m_GRI)/(2*H_GRI)
struct[0].f[2,0] = (-e1q_GRI - i_d_GRI*(-X1d_GRI + X_d_GRI) + v_f_GRI)/T1d0_GRI
struct[0].f[3,0] = (-e1d_GRI + i_q_GRI*(-X1q_GRI + X_q_GRI))/T1q0_GRI
struct[0].f[4,0] = (V_GRI - v_c_GRI)/T_r_GRI
struct[0].f[5,0] = (-p_m_GRI + p_m_ref_GRI)/T_m_GRI
struct[0].f[6,0] = omega_GRI - 1
# Algebraic equations:
if mode == 3:
g_n = np.ascontiguousarray(struct[0].Gy) @ np.ascontiguousarray(struct[0].y_run) + np.ascontiguousarray(struct[0].Gu) @ np.ascontiguousarray(struct[0].u_run)  # linearized algebraic residual (computed but not used below)
struct[0].g[0,0] = -P_GRI/S_base - P_GRI_1/S_base + V_GRI**2*g_GRI_POI + V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].g[1,0] = -Q_GRI/S_base - Q_GRI_1/S_base - V_GRI**2*b_GRI_POI + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].g[2,0] = -P_POI/S_base + V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + V_POI**2*(g_GRI_POI + g_POI_PMV)
struct[0].g[3,0] = -Q_POI/S_base + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + V_POI**2*(-b_GRI_POI - b_POI_PMV)
struct[0].g[4,0] = -P_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV**2*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].g[5,0] = -Q_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV**2*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].g[6,0] = -P_GR1/S_base + V_GR1**2*(g_GR1_GR2 + g_PMV_GR1) + V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].g[7,0] = -Q_GR1/S_base + V_GR1**2*(-b_GR1_GR2 - b_PMV_GR1) + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].g[8,0] = -P_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR2**2*g_GR1_GR2
struct[0].g[9,0] = -Q_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - V_GR2**2*b_GR1_GR2
struct[0].g[10,0] = -P_GR3/S_base + V_GR3**2*(g_GR3_GR4 + g_PMV_GR3) + V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].g[11,0] = -Q_GR3/S_base + V_GR3**2*(-b_GR3_GR4 - b_PMV_GR3) + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].g[12,0] = -P_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR4**2*g_GR3_GR4
struct[0].g[13,0] = -Q_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - V_GR4**2*b_GR3_GR4
struct[0].g[14,0] = R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI) + X1d_GRI*i_d_GRI - e1q_GRI
struct[0].g[15,0] = R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI) - X1q_GRI*i_q_GRI - e1d_GRI
struct[0].g[16,0] = -P_GRI_1/S_n_GRI + V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].g[17,0] = -Q_GRI_1/S_n_GRI + V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].g[18,0] = K_a_GRI*(-v_c_GRI + v_pss_GRI + v_ref_GRI) - v_f_GRI
struct[0].g[19,0] = -K_sec_GRI*xi_m_GRI - p_m_ref_GRI - (omega_GRI - 1)/Droop_GRI
# Outputs:
if mode == 3:
struct[0].h[0,0] = V_GRI
struct[0].h[1,0] = V_POI
struct[0].h[2,0] = V_PMV
struct[0].h[3,0] = V_GR1
struct[0].h[4,0] = V_GR2
struct[0].h[5,0] = V_GR3
struct[0].h[6,0] = V_GR4
if mode == 10:
struct[0].Fx[0,0] = -K_delta_GRI
struct[0].Fx[0,1] = Omega_b_GRI
struct[0].Fx[1,0] = (-V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fx[1,1] = -D_GRI/(2*H_GRI)
struct[0].Fx[1,5] = 1/(2*H_GRI)
struct[0].Fx[2,2] = -1/T1d0_GRI
struct[0].Fx[3,3] = -1/T1q0_GRI
struct[0].Fx[4,4] = -1/T_r_GRI
struct[0].Fx[5,5] = -1/T_m_GRI
if mode == 11:
struct[0].Fy[1,0] = (-i_d_GRI*sin(delta_GRI - theta_GRI) - i_q_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,1] = (V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,14] = (-2*R_a_GRI*i_d_GRI - V_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,15] = (-2*R_a_GRI*i_q_GRI - V_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[2,14] = (X1d_GRI - X_d_GRI)/T1d0_GRI
struct[0].Fy[2,18] = 1/T1d0_GRI
struct[0].Fy[3,15] = (-X1q_GRI + X_q_GRI)/T1q0_GRI
struct[0].Fy[4,0] = 1/T_r_GRI
struct[0].Fy[5,19] = 1/T_m_GRI
struct[0].Gx[14,0] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gx[14,2] = -1
struct[0].Gx[15,0] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gx[15,3] = -1
struct[0].Gx[16,0] = V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gx[17,0] = -V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gx[18,4] = -K_a_GRI
struct[0].Gx[19,1] = -1/Droop_GRI
struct[0].Gx[19,6] = -K_sec_GRI
struct[0].Gy[0,0] = 2*V_GRI*g_GRI_POI + V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[0,1] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[0,2] = V_GRI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[0,3] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[0,16] = -1/S_base
struct[0].Gy[1,0] = -2*V_GRI*b_GRI_POI + V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[1,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[1,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[1,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[1,17] = -1/S_base
struct[0].Gy[2,0] = V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[2,1] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[2,2] = V_GRI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + 2*V_POI*(g_GRI_POI + g_POI_PMV)
struct[0].Gy[2,3] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[2,4] = V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[2,5] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[3,0] = V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[3,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[3,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + 2*V_POI*(-b_GRI_POI - b_POI_PMV)
struct[0].Gy[3,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[3,4] = V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[3,5] = V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,2] = V_PMV*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,3] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[4,4] = V_GR1*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + 2*V_PMV*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,5] = V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[4,6] = V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[4,7] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[4,10] = V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[4,11] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[5,2] = V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[5,3] = V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[5,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + 2*V_PMV*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[5,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[5,6] = V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[5,7] = V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[5,10] = V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[5,11] = V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[6,4] = V_GR1*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[6,5] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[6,6] = 2*V_GR1*(g_GR1_GR2 + g_PMV_GR1) + V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[6,7] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[6,8] = V_GR1*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[6,9] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[7,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[7,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[7,6] = 2*V_GR1*(-b_GR1_GR2 - b_PMV_GR1) + V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[7,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[7,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[7,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[8,6] = V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[8,7] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[8,8] = V_GR1*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + 2*V_GR2*g_GR1_GR2
struct[0].Gy[8,9] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[9,6] = V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[9,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[9,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - 2*V_GR2*b_GR1_GR2
struct[0].Gy[9,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[10,4] = V_GR3*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[10,5] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[10,10] = 2*V_GR3*(g_GR3_GR4 + g_PMV_GR3) + V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[10,11] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[10,12] = V_GR3*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[10,13] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[11,4] = V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[11,5] = V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[11,10] = 2*V_GR3*(-b_GR3_GR4 - b_PMV_GR3) + V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[11,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[11,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[11,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[12,10] = V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[12,11] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[12,12] = V_GR3*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + 2*V_GR4*g_GR3_GR4
struct[0].Gy[12,13] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[13,10] = V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[13,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[13,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - 2*V_GR4*b_GR3_GR4
struct[0].Gy[13,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[14,0] = cos(delta_GRI - theta_GRI)
struct[0].Gy[14,1] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[14,14] = X1d_GRI
struct[0].Gy[14,15] = R_a_GRI
struct[0].Gy[15,0] = sin(delta_GRI - theta_GRI)
struct[0].Gy[15,1] = -V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[15,14] = R_a_GRI
struct[0].Gy[15,15] = -X1q_GRI
struct[0].Gy[16,0] = i_d_GRI*sin(delta_GRI - theta_GRI) + i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[16,1] = -V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[16,14] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[16,15] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[16,16] = -1/S_n_GRI
struct[0].Gy[17,0] = i_d_GRI*cos(delta_GRI - theta_GRI) - i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[17,1] = V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[17,14] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[17,15] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[17,17] = -1/S_n_GRI
if mode > 12:
struct[0].Gu[0,0] = -1/S_base
struct[0].Gu[1,1] = -1/S_base
struct[0].Gu[2,2] = -1/S_base
struct[0].Gu[3,3] = -1/S_base
struct[0].Gu[4,4] = -1/S_base
struct[0].Gu[5,5] = -1/S_base
struct[0].Gu[6,6] = -1/S_base
struct[0].Gu[7,7] = -1/S_base
struct[0].Gu[8,8] = -1/S_base
struct[0].Gu[9,9] = -1/S_base
struct[0].Gu[10,10] = -1/S_base
struct[0].Gu[11,11] = -1/S_base
struct[0].Gu[12,12] = -1/S_base
struct[0].Gu[13,13] = -1/S_base
struct[0].Hy[0,0] = 1
struct[0].Hy[1,2] = 1
struct[0].Hy[2,4] = 1
struct[0].Hy[3,6] = 1
struct[0].Hy[4,8] = 1
struct[0].Hy[5,10] = 1
struct[0].Hy[6,12] = 1
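The `Gy` entries above are the analytic partial derivatives of the bus power-mismatch equations with respect to the algebraic states. A minimal standalone sketch (not part of the generated module; the helper names and sample parameter values are illustrative assumptions) that checks one such entry — the `2*V*g + V_m*(-b*sin - g*cos)` form of `Gy[0,0]` — against a central finite difference:

```python
import math

def p_line(v_k, theta_k, v_m, theta_m, g, b):
    """Active power injected at bus k through line k-m (per-unit),
    mirroring the structure of g[0,0] above."""
    dtheta = theta_k - theta_m
    return v_k**2 * g + v_k * v_m * (-b * math.sin(dtheta) - g * math.cos(dtheta))

def dp_dv_k(v_k, theta_k, v_m, theta_m, g, b):
    """Analytic partial dP_k/dV_k, matching the form of Gy[0,0]."""
    dtheta = theta_k - theta_m
    return 2 * v_k * g + v_m * (-b * math.sin(dtheta) - g * math.cos(dtheta))

def check(v_k=1.02, theta_k=0.05, v_m=1.0, theta_m=0.0, g=0.5, b=-5.0, h=1e-6):
    """Return the gap between the central difference and the analytic partial."""
    numeric = (p_line(v_k + h, theta_k, v_m, theta_m, g, b)
               - p_line(v_k - h, theta_k, v_m, theta_m, g, b)) / (2 * h)
    return abs(numeric - dp_dv_k(v_k, theta_k, v_m, theta_m, g, b))
```

The same pattern applies to any entry in the block: perturb the corresponding `y` component of the mismatch equation and compare against the assigned expression.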
def ini_nn(struct,mode):
# Parameters:
S_base = struct[0].S_base
g_GRI_POI = struct[0].g_GRI_POI
b_GRI_POI = struct[0].b_GRI_POI
g_POI_PMV = struct[0].g_POI_PMV
b_POI_PMV = struct[0].b_POI_PMV
g_PMV_GR1 = struct[0].g_PMV_GR1
b_PMV_GR1 = struct[0].b_PMV_GR1
g_GR1_GR2 = struct[0].g_GR1_GR2
b_GR1_GR2 = struct[0].b_GR1_GR2
g_PMV_GR3 = struct[0].g_PMV_GR3
b_PMV_GR3 = struct[0].b_PMV_GR3
g_GR3_GR4 = struct[0].g_GR3_GR4
b_GR3_GR4 = struct[0].b_GR3_GR4
U_GRI_n = struct[0].U_GRI_n
U_POI_n = struct[0].U_POI_n
U_PMV_n = struct[0].U_PMV_n
U_GR1_n = struct[0].U_GR1_n
U_GR2_n = struct[0].U_GR2_n
U_GR3_n = struct[0].U_GR3_n
U_GR4_n = struct[0].U_GR4_n
S_n_GRI = struct[0].S_n_GRI
X_d_GRI = struct[0].X_d_GRI
X1d_GRI = struct[0].X1d_GRI
T1d0_GRI = struct[0].T1d0_GRI
X_q_GRI = struct[0].X_q_GRI
X1q_GRI = struct[0].X1q_GRI
T1q0_GRI = struct[0].T1q0_GRI
R_a_GRI = struct[0].R_a_GRI
X_l_GRI = struct[0].X_l_GRI
H_GRI = struct[0].H_GRI
D_GRI = struct[0].D_GRI
Omega_b_GRI = struct[0].Omega_b_GRI
omega_s_GRI = struct[0].omega_s_GRI
K_a_GRI = struct[0].K_a_GRI
T_r_GRI = struct[0].T_r_GRI
v_pss_GRI = struct[0].v_pss_GRI
Droop_GRI = struct[0].Droop_GRI
T_m_GRI = struct[0].T_m_GRI
K_sec_GRI = struct[0].K_sec_GRI
K_delta_GRI = struct[0].K_delta_GRI
v_ref_GRI = struct[0].v_ref_GRI
# Inputs:
P_GRI = struct[0].P_GRI
Q_GRI = struct[0].Q_GRI
P_POI = struct[0].P_POI
Q_POI = struct[0].Q_POI
P_PMV = struct[0].P_PMV
Q_PMV = struct[0].Q_PMV
P_GR1 = struct[0].P_GR1
Q_GR1 = struct[0].Q_GR1
P_GR2 = struct[0].P_GR2
Q_GR2 = struct[0].Q_GR2
P_GR3 = struct[0].P_GR3
Q_GR3 = struct[0].Q_GR3
P_GR4 = struct[0].P_GR4
Q_GR4 = struct[0].Q_GR4
# Dynamical states:
delta_GRI = struct[0].x[0,0]
omega_GRI = struct[0].x[1,0]
e1q_GRI = struct[0].x[2,0]
e1d_GRI = struct[0].x[3,0]
v_c_GRI = struct[0].x[4,0]
p_m_GRI = struct[0].x[5,0]
xi_m_GRI = struct[0].x[6,0]
# Algebraic states:
V_GRI = struct[0].y_ini[0,0]
theta_GRI = struct[0].y_ini[1,0]
V_POI = struct[0].y_ini[2,0]
theta_POI = struct[0].y_ini[3,0]
V_PMV = struct[0].y_ini[4,0]
theta_PMV = struct[0].y_ini[5,0]
V_GR1 = struct[0].y_ini[6,0]
theta_GR1 = struct[0].y_ini[7,0]
V_GR2 = struct[0].y_ini[8,0]
theta_GR2 = struct[0].y_ini[9,0]
V_GR3 = struct[0].y_ini[10,0]
theta_GR3 = struct[0].y_ini[11,0]
V_GR4 = struct[0].y_ini[12,0]
theta_GR4 = struct[0].y_ini[13,0]
i_d_GRI = struct[0].y_ini[14,0]
i_q_GRI = struct[0].y_ini[15,0]
P_GRI_1 = struct[0].y_ini[16,0]
Q_GRI_1 = struct[0].y_ini[17,0]
v_f_GRI = struct[0].y_ini[18,0]
p_m_ref_GRI = struct[0].y_ini[19,0]
# Differential equations:
if mode == 2:
struct[0].f[0,0] = -K_delta_GRI*delta_GRI + Omega_b_GRI*(omega_GRI - omega_s_GRI)
struct[0].f[1,0] = (-D_GRI*(omega_GRI - omega_s_GRI) - i_d_GRI*(R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI)) - i_q_GRI*(R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI)) + p_m_GRI)/(2*H_GRI)
struct[0].f[2,0] = (-e1q_GRI - i_d_GRI*(-X1d_GRI + X_d_GRI) + v_f_GRI)/T1d0_GRI
struct[0].f[3,0] = (-e1d_GRI + i_q_GRI*(-X1q_GRI + X_q_GRI))/T1q0_GRI
struct[0].f[4,0] = (V_GRI - v_c_GRI)/T_r_GRI
struct[0].f[5,0] = (-p_m_GRI + p_m_ref_GRI)/T_m_GRI
struct[0].f[6,0] = omega_GRI - 1
# Algebraic equations:
if mode == 3:
struct[0].g[0,0] = -P_GRI/S_base - P_GRI_1/S_base + V_GRI**2*g_GRI_POI + V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].g[1,0] = -Q_GRI/S_base - Q_GRI_1/S_base - V_GRI**2*b_GRI_POI + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].g[2,0] = -P_POI/S_base + V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + V_POI**2*(g_GRI_POI + g_POI_PMV)
struct[0].g[3,0] = -Q_POI/S_base + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + V_POI**2*(-b_GRI_POI - b_POI_PMV)
struct[0].g[4,0] = -P_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV**2*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].g[5,0] = -Q_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV**2*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].g[6,0] = -P_GR1/S_base + V_GR1**2*(g_GR1_GR2 + g_PMV_GR1) + V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].g[7,0] = -Q_GR1/S_base + V_GR1**2*(-b_GR1_GR2 - b_PMV_GR1) + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].g[8,0] = -P_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR2**2*g_GR1_GR2
struct[0].g[9,0] = -Q_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - V_GR2**2*b_GR1_GR2
struct[0].g[10,0] = -P_GR3/S_base + V_GR3**2*(g_GR3_GR4 + g_PMV_GR3) + V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].g[11,0] = -Q_GR3/S_base + V_GR3**2*(-b_GR3_GR4 - b_PMV_GR3) + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].g[12,0] = -P_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR4**2*g_GR3_GR4
struct[0].g[13,0] = -Q_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - V_GR4**2*b_GR3_GR4
struct[0].g[14,0] = R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI) + X1d_GRI*i_d_GRI - e1q_GRI
struct[0].g[15,0] = R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI) - X1q_GRI*i_q_GRI - e1d_GRI
struct[0].g[16,0] = -P_GRI_1/S_n_GRI + V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].g[17,0] = -Q_GRI_1/S_n_GRI + V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].g[18,0] = K_a_GRI*(-v_c_GRI + v_pss_GRI + v_ref_GRI) - v_f_GRI
struct[0].g[19,0] = -K_sec_GRI*xi_m_GRI - p_m_ref_GRI - (omega_GRI - 1)/Droop_GRI
# Outputs:
if mode == 3:
struct[0].h[0,0] = V_GRI
struct[0].h[1,0] = V_POI
struct[0].h[2,0] = V_PMV
struct[0].h[3,0] = V_GR1
struct[0].h[4,0] = V_GR2
struct[0].h[5,0] = V_GR3
struct[0].h[6,0] = V_GR4
if mode == 10:
struct[0].Fx_ini[0,0] = -K_delta_GRI
struct[0].Fx_ini[0,1] = Omega_b_GRI
struct[0].Fx_ini[1,0] = (-V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fx_ini[1,1] = -D_GRI/(2*H_GRI)
struct[0].Fx_ini[1,5] = 1/(2*H_GRI)
struct[0].Fx_ini[2,2] = -1/T1d0_GRI
struct[0].Fx_ini[3,3] = -1/T1q0_GRI
struct[0].Fx_ini[4,4] = -1/T_r_GRI
struct[0].Fx_ini[5,5] = -1/T_m_GRI
struct[0].Fx_ini[6,1] = 1
if mode == 11:
struct[0].Fy_ini[1,0] = (-i_d_GRI*sin(delta_GRI - theta_GRI) - i_q_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,1] = (V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,14] = (-2*R_a_GRI*i_d_GRI - V_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[1,15] = (-2*R_a_GRI*i_q_GRI - V_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy_ini[2,14] = (X1d_GRI - X_d_GRI)/T1d0_GRI
struct[0].Fy_ini[2,18] = 1/T1d0_GRI
struct[0].Fy_ini[3,15] = (-X1q_GRI + X_q_GRI)/T1q0_GRI
struct[0].Fy_ini[4,0] = 1/T_r_GRI
struct[0].Fy_ini[5,19] = 1/T_m_GRI
struct[0].Gy_ini[0,0] = 2*V_GRI*g_GRI_POI + V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[0,1] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[0,2] = V_GRI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[0,3] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[0,16] = -1/S_base
struct[0].Gy_ini[1,0] = -2*V_GRI*b_GRI_POI + V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[1,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[1,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[1,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[1,17] = -1/S_base
struct[0].Gy_ini[2,0] = V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[2,1] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[2,2] = V_GRI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + 2*V_POI*(g_GRI_POI + g_POI_PMV)
struct[0].Gy_ini[2,3] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[2,4] = V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[2,5] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[3,0] = V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy_ini[3,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy_ini[3,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + 2*V_POI*(-b_GRI_POI - b_POI_PMV)
struct[0].Gy_ini[3,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[3,4] = V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[3,5] = V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,2] = V_PMV*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,3] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[4,4] = V_GR1*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + 2*V_PMV*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[4,5] = V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[4,6] = V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[4,7] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[4,10] = V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[4,11] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[5,2] = V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[5,3] = V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[5,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + 2*V_PMV*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy_ini[5,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy_ini[5,6] = V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[5,7] = V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[5,10] = V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[5,11] = V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[6,4] = V_GR1*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,5] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,6] = 2*V_GR1*(g_GR1_GR2 + g_PMV_GR1) + V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,7] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[6,8] = V_GR1*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[6,9] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[7,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,6] = 2*V_GR1*(-b_GR1_GR2 - b_PMV_GR1) + V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy_ini[7,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[7,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,6] = V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,7] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[8,8] = V_GR1*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + 2*V_GR2*g_GR1_GR2
struct[0].Gy_ini[8,9] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,6] = V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[9,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - 2*V_GR2*b_GR1_GR2
struct[0].Gy_ini[9,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy_ini[10,4] = V_GR3*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,5] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,10] = 2*V_GR3*(g_GR3_GR4 + g_PMV_GR3) + V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,11] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[10,12] = V_GR3*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[10,13] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[11,4] = V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,5] = V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,10] = 2*V_GR3*(-b_GR3_GR4 - b_PMV_GR3) + V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy_ini[11,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[11,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,10] = V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,11] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[12,12] = V_GR3*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + 2*V_GR4*g_GR3_GR4
struct[0].Gy_ini[12,13] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,10] = V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[13,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - 2*V_GR4*b_GR3_GR4
struct[0].Gy_ini[13,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy_ini[14,0] = cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[14,1] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[14,14] = X1d_GRI
struct[0].Gy_ini[14,15] = R_a_GRI
struct[0].Gy_ini[15,0] = sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[15,1] = -V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[15,14] = R_a_GRI
struct[0].Gy_ini[15,15] = -X1q_GRI
struct[0].Gy_ini[16,0] = i_d_GRI*sin(delta_GRI - theta_GRI) + i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,1] = -V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,14] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,15] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[16,16] = -1/S_n_GRI
struct[0].Gy_ini[17,0] = i_d_GRI*cos(delta_GRI - theta_GRI) - i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,1] = V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,14] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,15] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy_ini[17,17] = -1/S_n_GRI
struct[0].Gy_ini[18,18] = -1
struct[0].Gy_ini[19,19] = -1
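Rows 14-15 of `Gy_ini` come from the stator algebraic equations `g[14,0]` and `g[15,0]`. A small self-contained sketch (hypothetical helper names and sample values, not part of the generated module) that verifies the four partials of the d-axis residual against finite differences:

```python
import math

def stator_d(v, theta, delta, i_d, i_q, e1q, r_a, x1d):
    """Residual of g[14,0]: R_a*i_q + V*cos(delta-theta) + X1d*i_d - e1q."""
    return r_a * i_q + v * math.cos(delta - theta) + x1d * i_d - e1q

def stator_d_jac(v, theta, delta, i_d, i_q, e1q, r_a, x1d):
    """Partials wrt (V, theta, i_d, i_q), matching
    Gy_ini[14,0], Gy_ini[14,1], Gy_ini[14,14], Gy_ini[14,15]."""
    return (math.cos(delta - theta),
            v * math.sin(delta - theta),
            x1d,
            r_a)

def check_row14(h=1e-6):
    """Max gap between central differences and the analytic partials."""
    args = dict(v=1.01, theta=0.02, delta=0.4, i_d=0.3, i_q=0.5,
                e1q=1.05, r_a=0.003, x1d=0.3)
    errs = []
    for name, analytic in zip(('v', 'theta', 'i_d', 'i_q'),
                              stator_d_jac(**args)):
        up = dict(args); up[name] += h
        dn = dict(args); dn[name] -= h
        numeric = (stator_d(**up) - stator_d(**dn)) / (2 * h)
        errs.append(abs(numeric - analytic))
    return max(errs)
```

Note the sign of `Gy_ini[14,1]`: differentiating `V*cos(delta - theta)` with respect to `theta` gives `+V*sin(delta - theta)` by the chain rule, which is exactly the assigned expression.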
def run_nn(t,struct,mode):
# Parameters:
S_base = struct[0].S_base
g_GRI_POI = struct[0].g_GRI_POI
b_GRI_POI = struct[0].b_GRI_POI
g_POI_PMV = struct[0].g_POI_PMV
b_POI_PMV = struct[0].b_POI_PMV
g_PMV_GR1 = struct[0].g_PMV_GR1
b_PMV_GR1 = struct[0].b_PMV_GR1
g_GR1_GR2 = struct[0].g_GR1_GR2
b_GR1_GR2 = struct[0].b_GR1_GR2
g_PMV_GR3 = struct[0].g_PMV_GR3
b_PMV_GR3 = struct[0].b_PMV_GR3
g_GR3_GR4 = struct[0].g_GR3_GR4
b_GR3_GR4 = struct[0].b_GR3_GR4
U_GRI_n = struct[0].U_GRI_n
U_POI_n = struct[0].U_POI_n
U_PMV_n = struct[0].U_PMV_n
U_GR1_n = struct[0].U_GR1_n
U_GR2_n = struct[0].U_GR2_n
U_GR3_n = struct[0].U_GR3_n
U_GR4_n = struct[0].U_GR4_n
S_n_GRI = struct[0].S_n_GRI
X_d_GRI = struct[0].X_d_GRI
X1d_GRI = struct[0].X1d_GRI
T1d0_GRI = struct[0].T1d0_GRI
X_q_GRI = struct[0].X_q_GRI
X1q_GRI = struct[0].X1q_GRI
T1q0_GRI = struct[0].T1q0_GRI
R_a_GRI = struct[0].R_a_GRI
X_l_GRI = struct[0].X_l_GRI
H_GRI = struct[0].H_GRI
D_GRI = struct[0].D_GRI
Omega_b_GRI = struct[0].Omega_b_GRI
omega_s_GRI = struct[0].omega_s_GRI
K_a_GRI = struct[0].K_a_GRI
T_r_GRI = struct[0].T_r_GRI
v_pss_GRI = struct[0].v_pss_GRI
Droop_GRI = struct[0].Droop_GRI
T_m_GRI = struct[0].T_m_GRI
K_sec_GRI = struct[0].K_sec_GRI
K_delta_GRI = struct[0].K_delta_GRI
v_ref_GRI = struct[0].v_ref_GRI
# Inputs:
P_GRI = struct[0].P_GRI
Q_GRI = struct[0].Q_GRI
P_POI = struct[0].P_POI
Q_POI = struct[0].Q_POI
P_PMV = struct[0].P_PMV
Q_PMV = struct[0].Q_PMV
P_GR1 = struct[0].P_GR1
Q_GR1 = struct[0].Q_GR1
P_GR2 = struct[0].P_GR2
Q_GR2 = struct[0].Q_GR2
P_GR3 = struct[0].P_GR3
Q_GR3 = struct[0].Q_GR3
P_GR4 = struct[0].P_GR4
Q_GR4 = struct[0].Q_GR4
# Dynamical states:
delta_GRI = struct[0].x[0,0]
omega_GRI = struct[0].x[1,0]
e1q_GRI = struct[0].x[2,0]
e1d_GRI = struct[0].x[3,0]
v_c_GRI = struct[0].x[4,0]
p_m_GRI = struct[0].x[5,0]
xi_m_GRI = struct[0].x[6,0]
# Algebraic states:
V_GRI = struct[0].y_run[0,0]
theta_GRI = struct[0].y_run[1,0]
V_POI = struct[0].y_run[2,0]
theta_POI = struct[0].y_run[3,0]
V_PMV = struct[0].y_run[4,0]
theta_PMV = struct[0].y_run[5,0]
V_GR1 = struct[0].y_run[6,0]
theta_GR1 = struct[0].y_run[7,0]
V_GR2 = struct[0].y_run[8,0]
theta_GR2 = struct[0].y_run[9,0]
V_GR3 = struct[0].y_run[10,0]
theta_GR3 = struct[0].y_run[11,0]
V_GR4 = struct[0].y_run[12,0]
theta_GR4 = struct[0].y_run[13,0]
i_d_GRI = struct[0].y_run[14,0]
i_q_GRI = struct[0].y_run[15,0]
P_GRI_1 = struct[0].y_run[16,0]
Q_GRI_1 = struct[0].y_run[17,0]
v_f_GRI = struct[0].y_run[18,0]
p_m_ref_GRI = struct[0].y_run[19,0]
# Differential equations:
if mode == 2:
struct[0].f[0,0] = -K_delta_GRI*delta_GRI + Omega_b_GRI*(omega_GRI - omega_s_GRI)
struct[0].f[1,0] = (-D_GRI*(omega_GRI - omega_s_GRI) - i_d_GRI*(R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI)) - i_q_GRI*(R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI)) + p_m_GRI)/(2*H_GRI)
struct[0].f[2,0] = (-e1q_GRI - i_d_GRI*(-X1d_GRI + X_d_GRI) + v_f_GRI)/T1d0_GRI
struct[0].f[3,0] = (-e1d_GRI + i_q_GRI*(-X1q_GRI + X_q_GRI))/T1q0_GRI
struct[0].f[4,0] = (V_GRI - v_c_GRI)/T_r_GRI
struct[0].f[5,0] = (-p_m_GRI + p_m_ref_GRI)/T_m_GRI
struct[0].f[6,0] = omega_GRI - 1
# Algebraic equations:
if mode == 3:
struct[0].g[0,0] = -P_GRI/S_base - P_GRI_1/S_base + V_GRI**2*g_GRI_POI + V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].g[1,0] = -Q_GRI/S_base - Q_GRI_1/S_base - V_GRI**2*b_GRI_POI + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].g[2,0] = -P_POI/S_base + V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + V_POI**2*(g_GRI_POI + g_POI_PMV)
struct[0].g[3,0] = -Q_POI/S_base + V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + V_POI**2*(-b_GRI_POI - b_POI_PMV)
struct[0].g[4,0] = -P_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV**2*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].g[5,0] = -Q_PMV/S_base + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV**2*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].g[6,0] = -P_GR1/S_base + V_GR1**2*(g_GR1_GR2 + g_PMV_GR1) + V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].g[7,0] = -Q_GR1/S_base + V_GR1**2*(-b_GR1_GR2 - b_PMV_GR1) + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].g[8,0] = -P_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR2**2*g_GR1_GR2
struct[0].g[9,0] = -Q_GR2/S_base + V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - V_GR2**2*b_GR1_GR2
struct[0].g[10,0] = -P_GR3/S_base + V_GR3**2*(g_GR3_GR4 + g_PMV_GR3) + V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].g[11,0] = -Q_GR3/S_base + V_GR3**2*(-b_GR3_GR4 - b_PMV_GR3) + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].g[12,0] = -P_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR4**2*g_GR3_GR4
struct[0].g[13,0] = -Q_GR4/S_base + V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - V_GR4**2*b_GR3_GR4
struct[0].g[14,0] = R_a_GRI*i_q_GRI + V_GRI*cos(delta_GRI - theta_GRI) + X1d_GRI*i_d_GRI - e1q_GRI
struct[0].g[15,0] = R_a_GRI*i_d_GRI + V_GRI*sin(delta_GRI - theta_GRI) - X1q_GRI*i_q_GRI - e1d_GRI
struct[0].g[16,0] = -P_GRI_1/S_n_GRI + V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].g[17,0] = -Q_GRI_1/S_n_GRI + V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].g[18,0] = K_a_GRI*(-v_c_GRI + v_pss_GRI + v_ref_GRI) - v_f_GRI
struct[0].g[19,0] = -K_sec_GRI*xi_m_GRI - p_m_ref_GRI - (omega_GRI - 1)/Droop_GRI
# Outputs:
if mode == 3:
struct[0].h[0,0] = V_GRI
struct[0].h[1,0] = V_POI
struct[0].h[2,0] = V_PMV
struct[0].h[3,0] = V_GR1
struct[0].h[4,0] = V_GR2
struct[0].h[5,0] = V_GR3
struct[0].h[6,0] = V_GR4
if mode == 10:
struct[0].Fx[0,0] = -K_delta_GRI
struct[0].Fx[0,1] = Omega_b_GRI
struct[0].Fx[1,0] = (-V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fx[1,1] = -D_GRI/(2*H_GRI)
struct[0].Fx[1,5] = 1/(2*H_GRI)
struct[0].Fx[2,2] = -1/T1d0_GRI
struct[0].Fx[3,3] = -1/T1q0_GRI
struct[0].Fx[4,4] = -1/T_r_GRI
struct[0].Fx[5,5] = -1/T_m_GRI
struct[0].Fx[6,1] = 1
if mode == 11:
struct[0].Fy[1,0] = (-i_d_GRI*sin(delta_GRI - theta_GRI) - i_q_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,1] = (V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) - V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,14] = (-2*R_a_GRI*i_d_GRI - V_GRI*sin(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[1,15] = (-2*R_a_GRI*i_q_GRI - V_GRI*cos(delta_GRI - theta_GRI))/(2*H_GRI)
struct[0].Fy[2,14] = (X1d_GRI - X_d_GRI)/T1d0_GRI
struct[0].Fy[2,18] = 1/T1d0_GRI
struct[0].Fy[3,15] = (-X1q_GRI + X_q_GRI)/T1q0_GRI
struct[0].Fy[4,0] = 1/T_r_GRI
struct[0].Fy[5,19] = 1/T_m_GRI
struct[0].Gy[0,0] = 2*V_GRI*g_GRI_POI + V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[0,1] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[0,2] = V_GRI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[0,3] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[0,16] = -1/S_base
struct[0].Gy[1,0] = -2*V_GRI*b_GRI_POI + V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[1,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[1,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[1,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[1,17] = -1/S_base
struct[0].Gy[2,0] = V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[2,1] = V_GRI*V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[2,2] = V_GRI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI)) + 2*V_POI*(g_GRI_POI + g_POI_PMV)
struct[0].Gy[2,3] = V_GRI*V_POI*(-b_GRI_POI*cos(theta_GRI - theta_POI) - g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[2,4] = V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[2,5] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[3,0] = V_POI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI))
struct[0].Gy[3,1] = V_GRI*V_POI*(-b_GRI_POI*sin(theta_GRI - theta_POI) + g_GRI_POI*cos(theta_GRI - theta_POI))
struct[0].Gy[3,2] = V_GRI*(b_GRI_POI*cos(theta_GRI - theta_POI) + g_GRI_POI*sin(theta_GRI - theta_POI)) + V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI)) + 2*V_POI*(-b_GRI_POI - b_POI_PMV)
struct[0].Gy[3,3] = V_GRI*V_POI*(b_GRI_POI*sin(theta_GRI - theta_POI) - g_GRI_POI*cos(theta_GRI - theta_POI)) + V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[3,4] = V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[3,5] = V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,2] = V_PMV*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,3] = V_PMV*V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[4,4] = V_GR1*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + 2*V_PMV*(g_PMV_GR1 + g_PMV_GR3 + g_POI_PMV) + V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[4,5] = V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*cos(theta_PMV - theta_POI) + g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[4,6] = V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[4,7] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[4,10] = V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[4,11] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[5,2] = V_PMV*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[5,3] = V_PMV*V_POI*(b_POI_PMV*sin(theta_PMV - theta_POI) + g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[5,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV)) + V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV)) + 2*V_PMV*(-b_PMV_GR1 - b_PMV_GR3 - b_POI_PMV) + V_POI*(b_POI_PMV*cos(theta_PMV - theta_POI) - g_POI_PMV*sin(theta_PMV - theta_POI))
struct[0].Gy[5,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV)) + V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV)) + V_PMV*V_POI*(-b_POI_PMV*sin(theta_PMV - theta_POI) - g_POI_PMV*cos(theta_PMV - theta_POI))
struct[0].Gy[5,6] = V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[5,7] = V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[5,10] = V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[5,11] = V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[6,4] = V_GR1*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[6,5] = V_GR1*V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[6,6] = 2*V_GR1*(g_GR1_GR2 + g_PMV_GR1) + V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[6,7] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*cos(theta_GR1 - theta_PMV) + g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[6,8] = V_GR1*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[6,9] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[7,4] = V_GR1*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[7,5] = V_GR1*V_PMV*(b_PMV_GR1*sin(theta_GR1 - theta_PMV) + g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[7,6] = 2*V_GR1*(-b_GR1_GR2 - b_PMV_GR1) + V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2)) + V_PMV*(b_PMV_GR1*cos(theta_GR1 - theta_PMV) - g_PMV_GR1*sin(theta_GR1 - theta_PMV))
struct[0].Gy[7,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + V_GR1*V_PMV*(-b_PMV_GR1*sin(theta_GR1 - theta_PMV) - g_PMV_GR1*cos(theta_GR1 - theta_PMV))
struct[0].Gy[7,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[7,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[8,6] = V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[8,7] = V_GR1*V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[8,8] = V_GR1*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2)) + 2*V_GR2*g_GR1_GR2
struct[0].Gy[8,9] = V_GR1*V_GR2*(-b_GR1_GR2*cos(theta_GR1 - theta_GR2) - g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[9,6] = V_GR2*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2))
struct[0].Gy[9,7] = V_GR1*V_GR2*(-b_GR1_GR2*sin(theta_GR1 - theta_GR2) + g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[9,8] = V_GR1*(b_GR1_GR2*cos(theta_GR1 - theta_GR2) + g_GR1_GR2*sin(theta_GR1 - theta_GR2)) - 2*V_GR2*b_GR1_GR2
struct[0].Gy[9,9] = V_GR1*V_GR2*(b_GR1_GR2*sin(theta_GR1 - theta_GR2) - g_GR1_GR2*cos(theta_GR1 - theta_GR2))
struct[0].Gy[10,4] = V_GR3*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[10,5] = V_GR3*V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[10,10] = 2*V_GR3*(g_GR3_GR4 + g_PMV_GR3) + V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[10,11] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*cos(theta_GR3 - theta_PMV) + g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[10,12] = V_GR3*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[10,13] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[11,4] = V_GR3*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[11,5] = V_GR3*V_PMV*(b_PMV_GR3*sin(theta_GR3 - theta_PMV) + g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[11,10] = 2*V_GR3*(-b_GR3_GR4 - b_PMV_GR3) + V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4)) + V_PMV*(b_PMV_GR3*cos(theta_GR3 - theta_PMV) - g_PMV_GR3*sin(theta_GR3 - theta_PMV))
struct[0].Gy[11,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + V_GR3*V_PMV*(-b_PMV_GR3*sin(theta_GR3 - theta_PMV) - g_PMV_GR3*cos(theta_GR3 - theta_PMV))
struct[0].Gy[11,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[11,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[12,10] = V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[12,11] = V_GR3*V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[12,12] = V_GR3*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4)) + 2*V_GR4*g_GR3_GR4
struct[0].Gy[12,13] = V_GR3*V_GR4*(-b_GR3_GR4*cos(theta_GR3 - theta_GR4) - g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[13,10] = V_GR4*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4))
struct[0].Gy[13,11] = V_GR3*V_GR4*(-b_GR3_GR4*sin(theta_GR3 - theta_GR4) + g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[13,12] = V_GR3*(b_GR3_GR4*cos(theta_GR3 - theta_GR4) + g_GR3_GR4*sin(theta_GR3 - theta_GR4)) - 2*V_GR4*b_GR3_GR4
struct[0].Gy[13,13] = V_GR3*V_GR4*(b_GR3_GR4*sin(theta_GR3 - theta_GR4) - g_GR3_GR4*cos(theta_GR3 - theta_GR4))
struct[0].Gy[14,0] = cos(delta_GRI - theta_GRI)
struct[0].Gy[14,1] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[14,14] = X1d_GRI
struct[0].Gy[14,15] = R_a_GRI
struct[0].Gy[15,0] = sin(delta_GRI - theta_GRI)
struct[0].Gy[15,1] = -V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[15,14] = R_a_GRI
struct[0].Gy[15,15] = -X1q_GRI
struct[0].Gy[16,0] = i_d_GRI*sin(delta_GRI - theta_GRI) + i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[16,1] = -V_GRI*i_d_GRI*cos(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[16,14] = V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[16,15] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[16,16] = -1/S_n_GRI
struct[0].Gy[17,0] = i_d_GRI*cos(delta_GRI - theta_GRI) - i_q_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[17,1] = V_GRI*i_d_GRI*sin(delta_GRI - theta_GRI) + V_GRI*i_q_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[17,14] = V_GRI*cos(delta_GRI - theta_GRI)
struct[0].Gy[17,15] = -V_GRI*sin(delta_GRI - theta_GRI)
struct[0].Gy[17,17] = -1/S_n_GRI
struct[0].Gy[18,18] = -1
struct[0].Gy[19,19] = -1
struct[0].Gu[0,0] = -1/S_base
struct[0].Gu[1,1] = -1/S_base
struct[0].Gu[2,2] = -1/S_base
struct[0].Gu[3,3] = -1/S_base
struct[0].Gu[4,4] = -1/S_base
struct[0].Gu[5,5] = -1/S_base
struct[0].Gu[6,6] = -1/S_base
struct[0].Gu[7,7] = -1/S_base
struct[0].Gu[8,8] = -1/S_base
struct[0].Gu[9,9] = -1/S_base
struct[0].Gu[10,10] = -1/S_base
struct[0].Gu[11,11] = -1/S_base
struct[0].Gu[12,12] = -1/S_base
struct[0].Gu[13,13] = -1/S_base
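The `Gy` entries above are the analytic power-flow Jacobian dg/dy of the bus power-balance residuals; for instance `Gy[0,0]` follows the pattern 2*V*g + V_other*(-b*sin(dtheta) - g*cos(dtheta)). A finite-difference sanity check of that pattern for a generic single line, with illustrative admittance values (`g=1.0`, `b=-5.0` are assumptions, not taken from this model):

```python
import numpy as np

def p_balance(V1, th1, V2, th2, g=1.0, b=-5.0):
    # Active-power residual at bus 1 for one line, matching the algebraic
    # pattern of the g[...] equations above (injection/S_base term omitted).
    return V1**2 * g + V1 * V2 * (-b * np.sin(th1 - th2) - g * np.cos(th1 - th2))

def dP_dV1(V1, th1, V2, th2, g=1.0, b=-5.0):
    # Analytic derivative with the same structure as Gy[0,0] above.
    return 2 * V1 * g + V2 * (-b * np.sin(th1 - th2) - g * np.cos(th1 - th2))

# A central finite difference should agree with the analytic entry.
args = (1.02, 0.1, 0.98, -0.05)
h = 1e-7
fd = (p_balance(args[0] + h, *args[1:]) - p_balance(args[0] - h, *args[1:])) / (2 * h)
```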
@numba.njit(cache=True)
def Piecewise(arg):
out = arg[0][1]
N = len(arg)
for it in range(N-1,-1,-1):
if arg[it][1]: out = arg[it][0]
return out
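`Piecewise` (and the identical `ITE`) evaluate SymPy-style piecewise expressions passed as `((value, condition), ...)` pairs: scanning from the last pair down to the first means the earliest pair with a true condition wins. A minimal pure-Python sketch of the same convention (here the fallback is taken as the first value when no condition holds):

```python
def piecewise(pairs):
    # First-true-wins evaluation of ((value, condition), ...) pairs.
    out = pairs[0][0]  # fallback when no condition is true
    for value, cond in reversed(pairs):
        if cond:
            out = value
    return out

piecewise(((1.0, False), (2.0, True), (3.0, True)))  # evaluates to 2.0
```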
@numba.njit(cache=True)
def ITE(arg):
out = arg[0][1]
N = len(arg)
for it in range(N-1,-1,-1):
if arg[it][1]: out = arg[it][0]
return out
@numba.njit(cache=True)
def Abs(x):
return np.abs(x)
@numba.njit(cache=True)
def daesolver(struct):
sin = np.sin
cos = np.cos
sqrt = np.sqrt
i = 0
Dt = struct[i].Dt
N_x = struct[i].N_x
N_y = struct[i].N_y
N_z = struct[i].N_z
decimation = struct[i].decimation
eye = np.eye(N_x)
t = struct[i].t
t_end = struct[i].t_end
if struct[i].it == 0:
run(t,struct, 1)
struct[i].it_store = 0
struct[i]['T'][0] = t
struct[i].X[0,:] = struct[i].x[:,0]
struct[i].Y[0,:] = struct[i].y_run[:,0]
struct[i].Z[0,:] = struct[i].h[:,0]
solver = struct[i].solvern
while t<t_end:
struct[i].it += 1
struct[i].t += Dt
t = struct[i].t
if solver == 5: # Trapezoidal DAE as in Milano's book
run(t,struct, 2)
run(t,struct, 3)
x = np.copy(struct[i].x[:])
y = np.copy(struct[i].y_run[:])
f = np.copy(struct[i].f[:])
g = np.copy(struct[i].g[:])
for iter in range(struct[i].imax):
run(t,struct, 2)
run(t,struct, 3)
run(t,struct,10)
run(t,struct,11)
x_i = struct[i].x[:]
y_i = struct[i].y_run[:]
f_i = struct[i].f[:]
g_i = struct[i].g[:]
F_x_i = struct[i].Fx[:,:]
F_y_i = struct[i].Fy[:,:]
G_x_i = struct[i].Gx[:,:]
G_y_i = struct[i].Gy[:,:]
A_c_i = np.vstack((np.hstack((eye-0.5*Dt*F_x_i, -0.5*Dt*F_y_i)),
np.hstack((G_x_i, G_y_i))))
f_n_i = x_i - x - 0.5*Dt*(f_i+f)
# print(t,iter,g_i)
Dxy_i = np.linalg.solve(-A_c_i,np.vstack((f_n_i,g_i)))
x_i = x_i + Dxy_i[0:N_x]
y_i = y_i + Dxy_i[N_x:(N_x+N_y)]
struct[i].x[:] = x_i
struct[i].y_run[:] = y_i
# [f_i,g_i,F_x_i,F_y_i,G_x_i,G_y_i] = smib_transient(x_i,y_i,u);
# A_c_i = [[eye(N_x)-0.5*Dt*F_x_i, -0.5*Dt*F_y_i],
# [ G_x_i, G_y_i]];
# f_n_i = x_i - x - 0.5*Dt*(f_i+f);
# Dxy_i = -A_c_i\[f_n_i.',g_i.'].';
# x_i = x_i + Dxy_i(1:N_x);
# y_i = y_i + Dxy_i(N_x+1:N_x+N_y);
xy = np.vstack((x_i,y_i))
max_relative = 0.0
for it_var in range(N_x+N_y):
abs_value = np.abs(xy[it_var,0])
if abs_value < 0.001:
abs_value = 0.001
relative_error = np.abs(Dxy_i[it_var,0])/abs_value
if relative_error > max_relative: max_relative = relative_error
if max_relative<struct[i].itol:
break
# if iter>struct[i].imax-2:
# print('Convergence problem')
struct[i].x[:] = x_i
struct[i].y_run[:] = y_i
# channels
if struct[i].store == 1:
it_store = struct[i].it_store
if struct[i].it >= it_store*decimation:
struct[i]['T'][it_store+1] = t
struct[i].X[it_store+1,:] = struct[i].x[:,0]
struct[i].Y[it_store+1,:] = struct[i].y_run[:,0]
struct[i].Z[it_store+1,:] = struct[i].h[:,0]
struct[i].iters[it_store+1,0] = iter
struct[i].it_store += 1
struct[i].t = t
return t
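The `solver == 5` branch above performs implicit trapezoidal integration, solving the nonlinear update with plain Newton iterations on the stacked (x, y) system. A scalar sketch of the same update for x' = -a*x, using the same residual, Newton matrix and convergence-test shapes (a toy problem, not this model):

```python
def trapezoidal_step(x0, dt, a=1.0, imax=20, tol=1e-10):
    # One implicit trapezoidal step for x' = -a*x via Newton iteration,
    # mirroring the structure of the solver == 5 branch above.
    f0 = -a * x0                     # f at the start of the step
    x_i = x0                         # initial guess: previous state
    for _ in range(imax):
        f_i = -a * x_i
        F_x = -a                     # df/dx (scalar analogue of Fx)
        A = 1.0 - 0.5 * dt * F_x     # scalar analogue of A_c_i
        f_n = x_i - x0 - 0.5 * dt * (f_i + f0)
        dx = -f_n / A                # solve A*dx = -f_n
        x_i += dx
        if abs(dx) / max(abs(x_i), 0.001) < tol:
            break
    return x_i

# For a linear problem the update has the closed form
# x0*(1 - a*dt/2)/(1 + a*dt/2), and Newton converges in one iteration.
x1 = trapezoidal_step(1.0, 0.1)
```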
# anuga/shallow_water/tests/test_forcing.py
"""
from __future__ import division
from builtins import str
from builtins import range
from past.utils import old_div
from future.utils import raise_
import unittest, os
import anuga
from anuga.shallow_water.shallow_water_domain import Domain
from anuga.shallow_water.boundaries import Reflective_boundary
from anuga.coordinate_transforms.geo_reference import Geo_reference
from anuga.file_conversion.file_conversion import timefile2netcdf
from anuga.abstract_2d_finite_volumes.mesh_factory import rectangular
from anuga.abstract_2d_finite_volumes.util import file_function
from anuga.config import netcdf_mode_r, netcdf_mode_w, netcdf_mode_a
from anuga.shallow_water.forcing import *
import numpy as num
import warnings
def scalar_func_list(t, x, y):
    """Function that returns a scalar in a list.
    Used to test the error message when a numeric array is expected.
    """
    return [17.7]

def scalar_func(t, x, y):
    """Function that returns a scalar.
    Used to test the error message when a numeric array is expected.
    """
    return 17.7
def speed(t, x, y):
"""
Variable windfield implemented using functions
Large speeds halfway between center and edges
Low speeds at center and edges
"""
from math import exp, cos, pi
x = num.array(x)
y = num.array(y)
N = len(x)
s = 0*x #New array
for k in range(N):
r = num.sqrt(x[k]**2 + y[k]**2)
factor = exp(-(r-0.15)**2)
s[k] = 4000 * factor * (cos(old_div(t*2*pi,150)) + 2)
return s
def angle(t, x, y):
"""Rotating field
"""
from math import atan, pi
x = num.array(x)
y = num.array(y)
N = len(x)
a = 0 * x # New array
for k in range(N):
r = num.sqrt(x[k]**2 + y[k]**2)
angle = atan(old_div(y[k],x[k]))
if x[k] < 0:
angle += pi
# Take normal direction
angle -= old_div(pi,2)
# Ensure positive radians
if angle < 0:
angle += 2*pi
a[k] = old_div(angle,pi)*180
return a
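`angle` returns the outward-normal bearing of each point in degrees, built from `atan` plus quadrant and wrap-around corrections. The same field can be written in one vectorised line with `atan2`; an equivalent sketch (not the code used by the tests):

```python
import numpy as np

def angle_normal_deg(x, y):
    # Bearing of (x, y) rotated by -90 degrees (the normal direction),
    # wrapped into [0, 360) -- equivalent to the branching loop above.
    return np.degrees(np.mod(np.arctan2(y, x) - np.pi / 2, 2 * np.pi))
```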
def time_varying_speed(t, x, y):
"""
Variable speed windfield
"""
from math import exp, cos, pi
x = num.array(x,float)
y = num.array(y,float)
N = len(x)
s = 0*x #New array
#dx=x[-1]-x[0]; dy = y[-1]-y[0]
S=100.
for k in range(N):
s[k]=S*(1.+t/100.)
return s
def time_varying_angle(t, x, y):
"""Rotating field
"""
from math import atan, pi
x = num.array(x,float)
y = num.array(y,float)
N = len(x)
a = 0 * x # New array
phi=135.
for k in range(N):
a[k]=phi*(1.+t/100.)
return a
def time_varying_pressure(t, x, y):
"""Rotating field
"""
from math import atan, pi
x = num.array(x,float)
y = num.array(y,float)
N = len(x)
p = 0 * x # New array
p0=1000.
for k in range(N):
p[k]=p0*(1.-t/100.)
return p
def spatial_linear_varying_speed(t, x, y):
"""
Variable speed windfield
"""
from math import exp, cos, pi
x = num.array(x)
y = num.array(y)
N = len(x)
s = 0*x #New array
#dx=x[-1]-x[0]; dy = y[-1]-y[0]
s0=250.
ymin=num.min(y)
xmin=num.min(x)
a=0.000025; b=0.0000125;
for k in range(N):
s[k]=s0*(1+t/100.)+a*x[k]+b*y[k]
return s
def spatial_linear_varying_angle(t, x, y):
"""Rotating field
"""
from math import atan, pi
x = num.array(x)
y = num.array(y)
N = len(x)
a = 0 * x # New array
phi=135.
b1=0.000025; b2=0.00001125;
for k in range(N):
a[k]=phi*(1+t/100.)+b1*x[k]+b2*y[k]
return a
def spatial_linear_varying_pressure(t, x, y):
p0=1000;
a=0.000025; b=0.0000125;
x = num.array(x)
y = num.array(y)
N = len(x)
p = 0 * x # New array
for k in range(N):
p[k]=p0*(1.-t/100.)+a*x[k]+b*y[k]
return p
def grid_1d(x0,dx,nx):
x = num.empty(nx,dtype=float)
for i in range(nx):
x[i]=x0+float(i)*dx
return x
def ndgrid(x,y):
nx = len(x)
ny = len(y)
X = num.empty(nx*ny,dtype=float)
Y = num.empty(nx*ny,dtype=float)
k=0
for i in range(nx):
for j in range(ny):
X[k]=x[i]
Y[k]=y[j]
k+=1
return X,Y
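`grid_1d` and `ndgrid` build the flattened coordinate arrays for the regular field grid: `grid_1d(x0, dx, nx)` is `x0 + dx*arange(nx)`, and `ndgrid` enumerates the product grid with x varying slowest. A vectorised NumPy equivalent (a sketch, not used by the tests):

```python
import numpy as np

x = 0.0 + 25.0 * np.arange(3)      # grid_1d(0.0, 25.0, 3)
y = 0.0 + 25.0 * np.arange(2)      # grid_1d(0.0, 25.0, 2)

# ndgrid(x, y): each x repeated len(y) times, y tiled len(x) times.
X = np.repeat(x, len(y))           # [ 0,  0, 25, 25, 50, 50]
Y = np.tile(y, len(x))             # [ 0, 25,  0, 25,  0, 25]
```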
class Test_Forcing(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
for file in ['domain.sww']:
try:
os.remove(file)
except:
pass
def write_wind_pressure_field_sts(self,
field_sts_filename,
nrows=10,
ncols=10,
cellsize=25,
origin=(0.0,0.0),
refzone=50,
timestep=1,
number_of_timesteps=10,
angle=135.0,
speed=100.0,
pressure=1000.0):
xllcorner=origin[0]
yllcorner=origin[1]
starttime = 0; endtime = number_of_timesteps*timestep;
no_data = -9999
time = num.arange(starttime, endtime, timestep, dtype='i')
x = grid_1d(xllcorner,cellsize,ncols)
y = grid_1d(yllcorner,cellsize,nrows)
[X,Y] = ndgrid(x,y)
number_of_points = nrows*ncols
wind_speed = num.empty((number_of_timesteps,nrows*ncols),dtype=float)
wind_angle = num.empty((number_of_timesteps,nrows*ncols),dtype=float)
barometric_pressure = num.empty((number_of_timesteps,nrows*ncols),
dtype=float)
if ( callable(speed) and callable(angle) and callable(pressure) ):
x = num.ones(3, float)
y = num.ones(3, float)
try:
s = speed(1.0, x=x, y=y)
a = angle(1.0, x=x, y=y)
p = pressure(1.0, x=x, y=y)
use_function=True
except Exception as e:
msg = 'Function could not be executed.\n'
raise_(Exception, msg)
else:
try :
speed=float(speed)
angle=float(angle)
pressure=float(pressure)
use_function=False
except:
msg = ('Force fields must be a scalar value coercible to float.')
raise_(Exception, msg)
for i,t in enumerate(time):
if ( use_function ):
wind_speed[i,:] = speed(t,X,Y)
wind_angle[i,:] = angle(t,X,Y)
barometric_pressure[i,:] = pressure(t,X,Y)
else:
wind_speed[i,:] = speed
wind_angle[i,:] = angle
barometric_pressure[i,:] = pressure
# "Creating the field STS NetCDF file"
fid = NetCDFFile(field_sts_filename+'.sts', 'w')
fid.institution = 'Geoscience Australia'
fid.description = "description"
fid.starttime = 0.0
fid.ncols = ncols
fid.nrows = nrows
fid.cellsize = cellsize
fid.no_data = no_data
fid.createDimension('number_of_points', number_of_points)
fid.createDimension('number_of_timesteps', number_of_timesteps)
fid.createDimension('numbers_in_range', 2)
fid.createVariable('x', 'd', ('number_of_points',))
fid.createVariable('y', 'd', ('number_of_points',))
fid.createVariable('time', 'i', ('number_of_timesteps',))
fid.createVariable('wind_speed', 'd', ('number_of_timesteps',
'number_of_points'))
fid.createVariable('wind_speed_range', 'd', ('numbers_in_range', ))
fid.createVariable('wind_angle', 'd', ('number_of_timesteps',
'number_of_points'))
fid.createVariable('wind_angle_range', 'd', ('numbers_in_range',))
fid.createVariable('barometric_pressure', 'd', ('number_of_timesteps',
'number_of_points'))
fid.createVariable('barometric_pressure_range', 'd', ('numbers_in_range',))
fid.variables['wind_speed_range'][:] = num.array([1e+036, -1e+036])
fid.variables['wind_angle_range'][:] = num.array([1e+036, -1e+036])
fid.variables['barometric_pressure_range'][:] = num.array([1e+036, -1e+036])
fid.variables['time'][:] = time
ws = fid.variables['wind_speed']
wa = fid.variables['wind_angle']
pr = fid.variables['barometric_pressure']
for i in range(number_of_timesteps):
ws[i] = wind_speed[i,:]
wa[i] = wind_angle[i,:]
pr[i] = barometric_pressure[i,:]
origin = anuga.coordinate_transforms.geo_reference.Geo_reference(refzone,
xllcorner,
yllcorner)
geo_ref = anuga.coordinate_transforms.geo_reference.write_NetCDF_georeference(origin, fid)
fid.variables['x'][:]=X-geo_ref.get_xllcorner()
fid.variables['y'][:]=Y-geo_ref.get_yllcorner()
fid.close()
def test_constant_wind_stress(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
#Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
#Setup only one forcing term, constant wind stress
s = 100
phi = 135
domain.forcing_terms = []
domain.forcing_terms.append(Wind_stress(s, phi))
domain.compute_forcing_terms()
const = old_div(eta_w*rho_a, rho_w)
#Convert to radians
phi = old_div(phi*pi, 180)
#Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
#Compute wind stress
S = const * num.sqrt(u**2 + v**2)
assert num.allclose(domain.quantities['stage'].explicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].explicit_update, S*u)
assert num.allclose(domain.quantities['ymomentum'].explicit_update, S*v)
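The reference solution in the test above follows the standard quadratic wind-stress parameterisation: the momentum sources are S*u and S*v with S = (eta_w*rho_a/rho_w)*|w|. A standalone sketch of that computation (the default density/drag values here are illustrative, not necessarily ANUGA's `config` values):

```python
from math import pi, cos, sin, sqrt

def wind_stress_source(s, phi_deg, rho_a=1.2, rho_w=1023.0, eta_w=3.0e-3):
    # Momentum source terms for a constant wind of speed s (m/s) blowing
    # toward bearing phi_deg (degrees), as in the reference solution above.
    const = eta_w * rho_a / rho_w
    phi = phi_deg * pi / 180.0           # degrees -> radians
    u, v = s * cos(phi), s * sin(phi)    # wind velocity components
    S = const * sqrt(u**2 + v**2)        # stress magnitude factor
    return S * u, S * v                  # xmomentum, ymomentum updates

fx, fy = wind_stress_source(100.0, 135.0)  # wind toward the north-west
```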
def test_variable_wind_stress(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
#Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
domain.set_time(5.54) # Take a random time (not zero)
#Setup only one forcing term, constant wind stress
s = 100
phi = 135
domain.forcing_terms = []
domain.forcing_terms.append(Wind_stress(s=speed, phi=angle))
domain.compute_forcing_terms()
#Compute reference solution
const = eta_w*rho_a/rho_w
N = len(domain) # number_of_triangles
xc = domain.get_centroid_coordinates()
t = domain.get_time()
x = xc[:,0]
y = xc[:,1]
s_vec = speed(t,x,y)
phi_vec = angle(t,x,y)
for k in range(N):
# Convert to radians
phi = old_div(phi_vec[k]*pi, 180)
s = s_vec[k]
# Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
# Compute wind stress
S = const * num.sqrt(u**2 + v**2)
assert num.allclose(domain.quantities['stage'].explicit_update[k],
0)
assert num.allclose(domain.quantities['xmomentum'].\
explicit_update[k],
S*u)
assert num.allclose(domain.quantities['ymomentum'].\
explicit_update[k],
S*v)
def test_windfield_from_file(self):
import time
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.abstract_2d_finite_volumes.util import file_function
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
domain.set_time(7) # Take a time that is represented in file (not zero)
# Write wind stress file (ensure that domaim time is covered)
# Take x=1 and y=0
filename = 'test_windstress_from_file'
start = time.mktime(time.strptime('2000', '%Y'))
fid = open(filename + '.txt', 'w')
dt = 1 # One second interval
t = 0.0
while t <= 10.0:
t_string = time.strftime(time_format, time.gmtime(t+start))
fid.write('%s, %f %f\n' %
(t_string, speed(t,[1],[0])[0], angle(t,[1],[0])[0]))
t += dt
fid.close()
timefile2netcdf(filename + '.txt')
os.remove(filename + '.txt')
# Setup wind stress
F = file_function(filename + '.tms',
quantities=['Attribute0', 'Attribute1'])
os.remove(filename + '.tms')
W = Wind_stress(F)
domain.forcing_terms = []
domain.forcing_terms.append(W)
domain.compute_forcing_terms()
# Compute reference solution
const = old_div(eta_w*rho_a, rho_w)
N = len(domain) # number_of_triangles
t = domain.get_time()
s = speed(t, [1], [0])[0]
phi = angle(t, [1], [0])[0]
# Convert to radians
phi = old_div(phi*pi, 180)
# Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
# Compute wind stress
S = const * num.sqrt(u**2 + v**2)
for k in range(N):
assert num.allclose(domain.quantities['stage'].explicit_update[k],
0)
assert num.allclose(domain.quantities['xmomentum'].\
explicit_update[k],
S*u)
assert num.allclose(domain.quantities['ymomentum'].\
explicit_update[k],
S*v)
def test_windfield_from_file_seconds(self):
import time
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.abstract_2d_finite_volumes.util import file_function
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
domain.set_time(7) # Take a time that is represented in file (not zero)
# Write wind stress file (ensure that domain time is covered)
# Take x=1 and y=0
filename = 'test_windstress_from_file'
start = time.mktime(time.strptime('2000', '%Y'))
fid = open(filename + '.txt', 'w')
dt = 0.5 # Half second interval
t = 0.0
while t <= 10.0:
fid.write('%s, %f %f\n'
% (str(t), speed(t, [1], [0])[0], angle(t, [1], [0])[0]))
t += dt
fid.close()
timefile2netcdf(filename + '.txt', time_as_seconds=True)
os.remove(filename + '.txt')
# Setup wind stress
F = file_function(filename + '.tms',
quantities=['Attribute0', 'Attribute1'])
os.remove(filename + '.tms')
W = Wind_stress(F)
domain.forcing_terms = []
domain.forcing_terms.append(W)
domain.compute_forcing_terms()
# Compute reference solution
const = old_div(eta_w*rho_a, rho_w)
N = len(domain) # number_of_triangles
t = domain.get_time()
s = speed(t, [1], [0])[0]
phi = angle(t, [1], [0])[0]
# Convert to radians
phi = old_div(phi*pi, 180)
# Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
# Compute wind stress
S = const * num.sqrt(u**2 + v**2)
for k in range(N):
assert num.allclose(domain.quantities['stage'].explicit_update[k],
0)
assert num.allclose(domain.quantities['xmomentum'].\
explicit_update[k],
S*u)
assert num.allclose(domain.quantities['ymomentum'].\
explicit_update[k],
S*v)
def test_wind_stress_error_condition(self):
"""Test that Wind_stress reacts properly when forcing functions
are wrong, e.g. when they return a scalar.
"""
from math import pi, cos, sin
from anuga.config import rho_a, rho_w, eta_w
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
domain.set_time(5.54) # Take a random time (not zero)
# Setup only one forcing term, bad func
domain.forcing_terms = []
try:
domain.forcing_terms.append(Wind_stress(s=scalar_func_list,
phi=angle))
except AssertionError:
pass
else:
msg = 'Should have raised exception'
raise_(Exception, msg)
try:
domain.forcing_terms.append(Wind_stress(s=speed, phi=scalar_func))
except Exception:
pass
else:
msg = 'Should have raised exception'
raise_(Exception, msg)
try:
domain.forcing_terms.append(Wind_stress(s=speed, phi='xx'))
except Exception:
pass
else:
msg = 'Should have raised exception'
raise_(Exception, msg)
def test_rainfall(self):
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, constant rainfall
domain.forcing_terms = []
domain.forcing_terms.append(Rainfall(domain, rate=2.0))
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update,
2.0/1000)
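The assertion above relies on Rainfall taking its rate in mm/s while explicit_update on stage is in m/s. A one-line sketch of that unit conversion (helper name is hypothetical):

```python
def rainfall_stage_update(rate_mm_per_s):
    """Convert a rainfall rate in mm/s to the resulting stage update in m/s."""
    return rate_mm_per_s / 1000.0

# A rate of 2.0 mm/s raises the stage by 0.002 m/s wherever the
# forcing applies, matching the assertion in test_rainfall above.
```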
def test_rainfall_restricted_by_polygon(self):
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, constant rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain, rate=2.0, polygon=[[1,1], [2,1], [2,2], [1,2]])
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
2.0/1000)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_time_dependent_rainfall_restricted_by_polygon(self):
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain,
rate=lambda t: 3*t + 7,
polygon = [[1,1], [2,1], [2,2], [1,2]])
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
domain.set_time(10.0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
(3*domain.get_time() + 7)/1000)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_time_dependent_rainfall_using_starttime(self):
rainfall_poly = ensure_numeric([[1,1], [2,1], [2,2], [1,2]], float)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain,
rate=lambda t: 3*t + 7,
polygon=rainfall_poly)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
# This will test that time is set to starttime in set_starttime
domain.set_starttime(5.0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
old_div((3*domain.get_time() + 7),1000))
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_absolute_time_dependent_rainfall_using_starttime(self):
rainfall_poly = ensure_numeric([[1,1], [2,1], [2,2], [1,2]], float)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain,
rate=lambda t: 3*t + 7,
polygon=rainfall_poly)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
# This will test that time is set to starttime in set_starttime
domain.set_starttime(5.0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
old_div((3*domain.get_starttime() + 7),1000))
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_time_dependent_rainfall_using_georef(self):
"""test_time_dependent_rainfall_using_georef
This will also test the General forcing term using georef
"""
# Mesh in zone 56 (absolute coords)
x0 = 314036.58727982
y0 = 6224951.2960092
rainfall_poly = ensure_numeric([[1,1], [2,1], [2,2], [1,2]], float)
rainfall_poly += [x0, y0]
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices,
geo_reference=Geo_reference(56, x0, y0))
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain,
rate=lambda t: 3*t + 7,
polygon=rainfall_poly)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
# This will test that time is set to starttime in set_starttime
domain.set_starttime(5.0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
old_div((3*domain.get_time() + 7),1000))
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_absolute_time_dependent_rainfall_using_georef(self):
"""test_absolute_time_dependent_rainfall_using_georef
This will also test the General forcing term using georef
"""
# Mesh in zone 56 (absolute coords)
x0 = 314036.58727982
y0 = 6224951.2960092
rainfall_poly = ensure_numeric([[1,1], [2,1], [2,2], [1,2]], float)
rainfall_poly += [x0, y0]
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices,
geo_reference=Geo_reference(56, x0, y0))
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# restricted to a polygon enclosing triangle #1 (bce)
domain.forcing_terms = []
R = Rainfall(domain,
rate=lambda t: 3*t + 7,
polygon=rainfall_poly)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
# This will test that time is set to starttime in set_starttime
domain.set_starttime(5.0)
domain.set_time(5.0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
(3*domain.get_time() + 7)/1000.0)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_absolute_time_dependent_rainfall_restricted_by_polygon_with_default(self):
"""
Test that the default rainfall rate is used when the given rate
function runs out of data.
"""
import warnings
warnings.simplefilter('ignore', UserWarning)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# that expires at t==20
from anuga.fit_interpolate.interpolate import Modeltime_too_late
def main_rate(t):
if t > 20:
msg = 'Model time exceeded.'
raise_(Modeltime_too_late, msg)
else:
return 3*t + 7
domain.forcing_terms = []
R = Rainfall(domain,
rate=main_rate,
polygon = [[1,1], [2,1], [2,2], [1,2]],
default_rate=5.0)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
domain.set_time(10.)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
(3*domain.get_time()+7)/1000)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
domain.set_time(100.)
domain.quantities['stage'].explicit_update[:] = 0.0 # Reset
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update[1],
5.0/1000) # Default value
assert num.allclose(domain.quantities['stage'].explicit_update[0], 0)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
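The fallback behaviour exercised above (use the primary rate until it raises Modeltime_too_late, then switch to default_rate) can be sketched independently of ANUGA. Modeltime_too_late is stubbed here as a plain exception, and rate_with_default is a hypothetical helper, not ANUGA API:

```python
class Modeltime_too_late(Exception):
    """Stand-in for anuga.fit_interpolate.interpolate.Modeltime_too_late."""
    pass

def main_rate(t):
    # Time-dependent rainfall that expires at t == 20, as in the test above.
    if t > 20:
        raise Modeltime_too_late('Model time exceeded.')
    return 3*t + 7

def rate_with_default(t, rate=main_rate, default_rate=5.0):
    """Use the primary rate while it has data; fall back to default_rate."""
    try:
        return rate(t)
    except Modeltime_too_late:
        return default_rate
```

At t=10 this yields 3*10 + 7 = 37 (mm/s), and at t=100 the default of 5.0, mirroring the two phases of the test.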
def test_rainfall_forcing_with_evolve(self):
"""test_rainfall_forcing_with_evolve
Test how forcing terms are called within evolve
"""
# FIXME(Ole): This test is just to experiment
import warnings
warnings.simplefilter('ignore', UserWarning)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# that expires at t==20
from anuga.fit_interpolate.interpolate import Modeltime_too_late
def main_rate(t):
if t > 20:
msg = 'Model time exceeded.'
raise_(Modeltime_too_late, msg)
else:
return 3*t + 7
domain.forcing_terms = []
R = Rainfall(domain,
rate=main_rate,
polygon=[[1,1], [2,1], [2,2], [1,2]],
default_rate=5.0)
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
for t in domain.evolve(yieldstep=1, finaltime=25):
pass
#FIXME(Ole): A test here is hard because explicit_update also
# receives updates from the flux calculation.
def test_rainfall_forcing_with_evolve_1(self):
"""test_rainfall_forcing_with_evolve_1
Test how forcing terms are called within evolve.
This test checks that the proper exception is raised when no
default_rate is set.
"""
import warnings
warnings.simplefilter('ignore', UserWarning)
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent rainfall
# that expires at t==20
from anuga.fit_interpolate.interpolate import Modeltime_too_late
def main_rate(t):
if t > 20:
msg = 'Model time exceeded.'
raise_(Modeltime_too_late, msg)
else:
return 3*t + 7
domain.forcing_terms = []
R = Rainfall(domain,
rate=main_rate,
polygon=[[1,1], [2,1], [2,2], [1,2]])
assert num.allclose(R.exchange_area, 2)
domain.forcing_terms.append(R)
try:
for t in domain.evolve(yieldstep=1, finaltime=25):
pass
except Modeltime_too_late as e:
# Test that error message is as expected
assert 'can specify keyword argument default_rate in the forcing function' in str(e)
else:
raise Exception('Should have raised exception')
def test_constant_wind_stress_from_file(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
nrows=5; ncols = 6;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 6
timestep=12*60
eps=2e-16
points, vertices, boundary =rectangular(nrows-2,ncols-2,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Setup only one forcing term, constant wind stress
s = 100
phi = 135
pressure=1000
domain.forcing_terms = []
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=10,
speed=s,
angle=phi,
pressure=pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['wind_speed', 'wind_angle'],
interpolation_points = midpoints)
W = Wind_stress(F,use_coordinates=False)
domain.forcing_terms.append(W)
domain.compute_forcing_terms()
const = old_div(eta_w*rho_a, rho_w)
# Convert to radians
phi = old_div(phi*pi, 180)
# Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
# Compute wind stress
S = const * num.sqrt(u**2 + v**2)
assert num.allclose(domain.quantities['stage'].explicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].explicit_update, S*u)
assert num.allclose(domain.quantities['ymomentum'].explicit_update, S*v)
def test_variable_windfield_from_file(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows=25; ncols = 25;
nrows=10; ncols = 10;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 10
timestep=1
eps=2.e-16
spatial_thinning=1
points, vertices, boundary =rectangular(nrows-2,ncols-2,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
time=num.arange(0,10,1,float)
eval_time=time[7];
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
vertexpoints = domain.get_nodes()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
domain.set_time(7*timestep) # Take a time that is represented in file (not zero)
# Write wind stress file (ensure that domain time is covered)
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=10,
speed=spatial_linear_varying_speed,
angle=spatial_linear_varying_angle,
pressure=spatial_linear_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=spatial_thinning,
verbose=False)
# Setup wind stress
FW = file_function(field_sts_filename+'.sww', domain,
quantities=['wind_speed', 'wind_angle'],
interpolation_points = midpoints)
W = Wind_stress(FW,use_coordinates=False)
domain.forcing_terms = []
domain.forcing_terms.append(W)
domain.compute_forcing_terms()
# Compute reference solution
const = old_div(eta_w*rho_a, rho_w)
N = len(domain) # number_of_triangles
xc = domain.get_centroid_coordinates()
t = domain.get_time()
x = xc[:,0]
y = xc[:,1]
s_vec = spatial_linear_varying_speed(t,x,y)
phi_vec = spatial_linear_varying_angle(t,x,y)
for k in range(N):
# Convert to radians
phi = old_div(phi_vec[k]*pi, 180)
s = s_vec[k]
# Compute velocity vector (u, v)
u = s*cos(phi)
v = s*sin(phi)
# Compute wind stress
S = const * num.sqrt(u**2 + v**2)
assert num.allclose(domain.quantities['stage'].explicit_update[k],0)
assert num.allclose(domain.quantities['xmomentum'].\
explicit_update[k],S*u,eps)
assert num.allclose(domain.quantities['ymomentum'].\
explicit_update[k],S*v,eps)
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
def test_variable_pressurefield_from_file(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows=25; ncols = 25;
nrows=10; ncols = 10;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 10
timestep=1
eps=2.e-16
spatial_thinning=1
points, vertices, boundary =rectangular(nrows-2,ncols-2,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
time=num.arange(0,10,1,float)
eval_time=time[7];
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
vertexpoints = domain.get_nodes()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
domain.set_time(7*timestep) # Take a time that is represented in file (not zero)
# Write wind stress file (ensure that domain time is covered)
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=10,
speed=spatial_linear_varying_speed,
angle=spatial_linear_varying_angle,
pressure=spatial_linear_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=spatial_thinning,
verbose=False)
# Setup barometric pressure
FP = file_function(field_sts_filename+'.sww', domain,
quantities=['barometric_pressure'],
interpolation_points = vertexpoints)
P = Barometric_pressure(FP,use_coordinates=False)
domain.forcing_terms = []
domain.forcing_terms.append(P)
domain.compute_forcing_terms()
N = len(domain) # number_of_triangles
xc = domain.get_centroid_coordinates()
t = domain.get_time()
x = xc[:,0]
y = xc[:,1]
p_vec = spatial_linear_varying_pressure(t,x,y)
h=1 #depth
px=0.000025 #pressure gradient in x-direction
py=0.0000125 #pressure gradient in y-direction
for k in range(N):
# Convert to radians
p = p_vec[k]
assert num.allclose(domain.quantities['stage'].explicit_update[k],0)
assert num.allclose(domain.quantities['xmomentum'].\
explicit_update[k],old_div(h*px,rho_w))
assert num.allclose(domain.quantities['ymomentum'].\
explicit_update[k],old_div(h*py,rho_w))
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
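The pressure test above asserts momentum updates of h*px/rho_w and h*py/rho_w for depth h and pressure gradients (px, py). A minimal sketch of that reference computation (helper name and the rho_w default are assumptions; magnitudes only, following the values asserted above):

```python
def pressure_gradient_update(h, px, py, rho_w=1023.0):
    """Momentum update magnitudes from a barometric pressure gradient
    (px, py) [Pa/m] acting on water of depth h [m]: (h/rho_w) * grad(p).
    rho_w is an assumed seawater density, not necessarily anuga.config's."""
    return h * px / rho_w, h * py / rho_w
```

With h=1, px=0.000025 and py=0.0000125 this reproduces the expected xmomentum and ymomentum updates checked in the loop above; stage is unaffected by the pressure forcing.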
def test_constant_wind_stress_from_file_evolve(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
nrows=5; ncols = 6;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 27
timestep=1
eps=2e-16
points, vertices, boundary =rectangular(nrows-2,ncols-2,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Setup only one forcing term, constant wind stress
s = 100
phi = 135
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=number_of_timesteps,
speed=s,
angle=phi)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['wind_speed', 'wind_angle'],
interpolation_points = midpoints)
W = Wind_stress(F,use_coordinates=False)
domain.forcing_terms.append(W)
valuesUsingFunction=num.empty((3,number_of_timesteps+1,midpoints.shape[0]),
float)
i=0
for t in domain.evolve(yieldstep=1, finaltime=number_of_timesteps*timestep):
valuesUsingFunction[0,i]=domain.quantities['stage'].explicit_update
valuesUsingFunction[1,i]=domain.quantities['xmomentum'].explicit_update
valuesUsingFunction[2,i]=domain.quantities['ymomentum'].explicit_update
i+=1
domain_II = Domain(points, vertices, boundary)
# Flat surface with 1m of water
domain_II.set_quantity('elevation', 0)
domain_II.set_quantity('stage', 1.0)
domain_II.set_quantity('friction', 0)
Br = Reflective_boundary(domain_II)
domain_II.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
s = 100
phi = 135
domain_II.forcing_terms = []
domain_II.forcing_terms.append(Wind_stress(s, phi))
i=0;
for t in domain_II.evolve(yieldstep=1,
finaltime=number_of_timesteps*timestep):
assert num.allclose(valuesUsingFunction[0,i],
domain_II.quantities['stage'].explicit_update), \
max(valuesUsingFunction[0,i] -
domain_II.quantities['stage'].explicit_update)
assert num.allclose(valuesUsingFunction[1,i],
domain_II.quantities['xmomentum'].explicit_update)
assert num.allclose(valuesUsingFunction[2,i],
domain_II.quantities['ymomentum'].explicit_update)
i+=1
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
def test_temporally_varying_wind_stress_from_file_evolve(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows=20; ncols = 20;
nrows=10; ncols = 10;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 28
timestep=1.
eps=2e-16
#points, vertices, boundary =rectangular(10,10,
points, vertices, boundary =rectangular(5,5,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Setup only one forcing term, constant wind stress
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=number_of_timesteps,
speed=time_varying_speed,
angle=time_varying_angle,
pressure=time_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['wind_speed', 'wind_angle'],
interpolation_points = midpoints)
#W = Wind_stress(F,use_coordinates=False)
W = Wind_stress_fast(F,filename=field_sts_filename+'.sww', domain=domain)
domain.forcing_terms.append(W)
valuesUsingFunction=num.empty((3,2*number_of_timesteps,midpoints.shape[0]),
float)
i=0
for t in domain.evolve(yieldstep=timestep/2., finaltime=(number_of_timesteps-1)*timestep):
valuesUsingFunction[0,i]=domain.quantities['stage'].explicit_update
valuesUsingFunction[1,i]=domain.quantities['xmomentum'].explicit_update
valuesUsingFunction[2,i]=domain.quantities['ymomentum'].explicit_update
i+=1
domain_II = Domain(points, vertices, boundary)
# Flat surface with 1m of water
domain_II.set_quantity('elevation', 0)
domain_II.set_quantity('stage', 1.0)
domain_II.set_quantity('friction', 0)
Br = Reflective_boundary(domain_II)
domain_II.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
domain_II.forcing_terms.append(Wind_stress(s=time_varying_speed,
phi=time_varying_angle))
i=0;
for t in domain_II.evolve(yieldstep=timestep/2.,
finaltime=(number_of_timesteps-1)*timestep):
assert num.allclose(valuesUsingFunction[0,i],
domain_II.quantities['stage'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[1,i],
domain_II.quantities['xmomentum'].explicit_update,
eps),(valuesUsingFunction[1,i]-
domain_II.quantities['xmomentum'].explicit_update)
assert num.allclose(valuesUsingFunction[2,i],
domain_II.quantities['ymomentum'].explicit_update,
eps)
i+=1
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
def test_spatially_varying_wind_stress_from_file_evolve(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows=20; ncols = 20;
nrows=10; ncols = 10;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 28
timestep=1.
eps=2e-16
#points, vertices, boundary =rectangular(10,10,
points, vertices, boundary =rectangular(5,5,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
midpoints = domain.get_centroid_coordinates()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Setup only one forcing term, constant wind stress
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=number_of_timesteps,
speed=spatial_linear_varying_speed,
angle=spatial_linear_varying_angle,
pressure=spatial_linear_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['wind_speed', 'wind_angle'],
interpolation_points = midpoints)
W = Wind_stress(F,use_coordinates=False)
domain.forcing_terms.append(W)
valuesUsingFunction=num.empty((3,number_of_timesteps,midpoints.shape[0]),
float)
i=0
for t in domain.evolve(yieldstep=timestep, finaltime=(number_of_timesteps-1)*timestep):
valuesUsingFunction[0,i]=domain.quantities['stage'].explicit_update
valuesUsingFunction[1,i]=domain.quantities['xmomentum'].explicit_update
valuesUsingFunction[2,i]=domain.quantities['ymomentum'].explicit_update
i+=1
domain_II = Domain(points, vertices, boundary)
# Flat surface with 1m of water
domain_II.set_quantity('elevation', 0)
domain_II.set_quantity('stage', 1.0)
domain_II.set_quantity('friction', 0)
Br = Reflective_boundary(domain_II)
domain_II.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
domain_II.forcing_terms.append(Wind_stress(s=spatial_linear_varying_speed,
phi=spatial_linear_varying_angle))
i=0;
for t in domain_II.evolve(yieldstep=timestep,
finaltime=(number_of_timesteps-1)*timestep):
#print valuesUsingFunction[1,i],domain_II.quantities['xmomentum'].explicit_update
assert num.allclose(valuesUsingFunction[0,i],
domain_II.quantities['stage'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[1,i],
domain_II.quantities['xmomentum'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[2,i],
domain_II.quantities['ymomentum'].explicit_update,
eps)
i+=1
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
def test_temporally_varying_pressure_stress_from_file_evolve(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows=20; ncols = 20;
nrows=10; ncols = 10;
refzone=50
xllcorner=366000;yllcorner=6369500;
number_of_timesteps = 28
timestep=10.
eps=2e-16
#points, vertices, boundary =rectangular(10,10,
points, vertices, boundary =rectangular(5,5,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
vertexpoints = domain.get_nodes()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Setup only one forcing term, constant wind stress
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=number_of_timesteps,
speed=time_varying_speed,
angle=time_varying_angle,
pressure=time_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
#print 'initialising file_function'
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['barometric_pressure'],
interpolation_points = vertexpoints)
#P = Barometric_pressure(F,use_coordinates=False)
#print 'initialising pressure forcing term'
P = Barometric_pressure_fast(p=F,filename=field_sts_filename+'.sww',domain=domain)
domain.forcing_terms.append(P)
valuesUsingFunction=num.empty((3,2*number_of_timesteps,len(domain)),
float)
i=0
import time as timer
t0=timer.time()
for t in domain.evolve(yieldstep=timestep/2., finaltime=(number_of_timesteps-1)*timestep):
valuesUsingFunction[0,i]=domain.quantities['stage'].explicit_update
valuesUsingFunction[1,i]=domain.quantities['xmomentum'].explicit_update
valuesUsingFunction[2,i]=domain.quantities['ymomentum'].explicit_update
i+=1
#domain.write_time()
t1=timer.time()
#print "That took %fs seconds" %(t1-t0)
domain_II = Domain(points, vertices, boundary)
# Flat surface with 1m of water
domain_II.set_quantity('elevation', 0)
domain_II.set_quantity('stage', 1.0)
domain_II.set_quantity('friction', 0)
Br = Reflective_boundary(domain_II)
domain_II.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
domain_II.forcing_terms.append(Barometric_pressure(p=time_varying_pressure))
i = 0
for t in domain_II.evolve(yieldstep=timestep/2.,
finaltime=(number_of_timesteps-1)*timestep):
assert num.allclose(valuesUsingFunction[0,i],
domain_II.quantities['stage'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[1,i],
domain_II.quantities['xmomentum'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[2,i],
domain_II.quantities['ymomentum'].explicit_update,
eps)
i+=1
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
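# NOTE (illustrative sketch, not part of the test suite): the buffer above is
# sized 2*number_of_timesteps along the time axis because evolve() with
# yieldstep = timestep/2 typically yields once at t = 0 and then once per
# yieldstep up to finaltime, i.e. 2*(N-1) + 1 = 2N - 1 samples.

```python
# Sample count for evolve(yieldstep=dt/2, finaltime=(N-1)*dt), assuming one
# yield at t = 0 plus one per yieldstep (standard ANUGA evolve behaviour).
N, dt = 28, 10.0                 # number_of_timesteps and timestep above
finaltime = (N - 1) * dt
yieldstep = dt / 2.0
num_yields = int(finaltime / yieldstep) + 1
print(num_yields)  # 55  (= 2*N - 1, which fits in the 2*N-slot buffer)
```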
def test_spatially_varying_pressure_stress_from_file_evolve(self):
from anuga.config import rho_a, rho_w, eta_w
from math import pi, cos, sin
from anuga.config import time_format
from anuga.file_conversion.sts2sww_mesh import sts2sww_mesh
cellsize = 25
#nrows = 20; ncols = 20
nrows = 10
ncols = 10
refzone = 50
xllcorner = 366000
yllcorner = 6369500
number_of_timesteps = 28
timestep=1.
eps=2e-16
#points, vertices, boundary = rectangular(10, 10,
points, vertices, boundary = rectangular(5, 5,
len1=cellsize*(ncols-1),
len2=cellsize*(nrows-1),
origin=(xllcorner,yllcorner))
domain = Domain(points, vertices, boundary)
vertexpoints = domain.get_nodes()
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
# Set up only one forcing term: spatially varying barometric pressure
field_sts_filename = 'wind_field'
self.write_wind_pressure_field_sts(field_sts_filename,
nrows=nrows,
ncols=ncols,
cellsize=cellsize,
origin=(xllcorner,yllcorner),
refzone=50,
timestep=timestep,
number_of_timesteps=number_of_timesteps,
speed=spatial_linear_varying_speed,
angle=spatial_linear_varying_angle,
pressure=spatial_linear_varying_pressure)
sts2sww_mesh(field_sts_filename,spatial_thinning=1,
verbose=False)
# Setup wind stress
F = file_function(field_sts_filename+'.sww', domain,
quantities=['barometric_pressure'],
interpolation_points = vertexpoints)
P = Barometric_pressure(F,use_coordinates=False)
domain.forcing_terms.append(P)
valuesUsingFunction=num.empty((3,number_of_timesteps,len(domain)),
float)
i=0
for t in domain.evolve(yieldstep=timestep, finaltime=(number_of_timesteps-1)*timestep):
valuesUsingFunction[0,i]=domain.quantities['stage'].explicit_update
valuesUsingFunction[1,i]=domain.quantities['xmomentum'].explicit_update
valuesUsingFunction[2,i]=domain.quantities['ymomentum'].explicit_update
i+=1
domain_II = Domain(points, vertices, boundary)
# Flat surface with 1m of water
domain_II.set_quantity('elevation', 0)
domain_II.set_quantity('stage', 1.0)
domain_II.set_quantity('friction', 0)
Br = Reflective_boundary(domain_II)
domain_II.set_boundary({'top': Br, 'bottom' :Br, 'left': Br, 'right': Br})
domain_II.forcing_terms.append(Barometric_pressure(p=spatial_linear_varying_pressure))
i = 0
for t in domain_II.evolve(yieldstep=timestep,
finaltime=(number_of_timesteps-1)*timestep):
assert num.allclose(valuesUsingFunction[0,i],
domain_II.quantities['stage'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[1,i],
domain_II.quantities['xmomentum'].explicit_update,
eps)
assert num.allclose(valuesUsingFunction[2,i],
domain_II.quantities['ymomentum'].explicit_update,
eps)
i+=1
os.remove(field_sts_filename+'.sts')
os.remove(field_sts_filename+'.sww')
def test_flux_gravity(self):
#Assuming no friction
from anuga.config import g
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
domain.set_flow_algorithm('1_5')
B = Reflective_boundary(domain)
domain.set_boundary( {'exterior': B})
#Set up for a gradient of (3,0) at mid triangle (bce)
def slope(x, y):
return 3*x
h = 0.1
def stage(x, y):
return slope(x, y) + h
domain.set_quantity('elevation', slope)
domain.set_quantity('stage', stage)
for name in domain.conserved_quantities:
assert num.allclose(domain.quantities[name].explicit_update, 0)
assert num.allclose(domain.quantities[name].semi_implicit_update, 0)
# fluxes and gravity term are now combined. To ensure zero flux on boundary
# need to set reflective boundaries
domain.update_boundary()
domain.compute_fluxes()
assert num.allclose(domain.quantities['stage'].explicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].explicit_update, -g*h*3)
assert num.allclose(domain.quantities['ymomentum'].explicit_update, 0)
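# NOTE (minimal sketch of the balance the test above checks, with g assumed
# equal to the anuga.config value of 9.8): for stage = elevation + h over the
# bed z = 3*x, the combined flux/gravity term on xmomentum is -g*h*dz/dx.

```python
g = 9.8        # gravitational acceleration (assumed value of anuga.config.g)
h = 0.1        # uniform depth from stage(x, y) = slope(x, y) + h
dzdx = 3.0     # bed gradient set by slope(x, y) = 3*x
expected_xmom = -g * h * dzdx
print(round(expected_xmom, 10))  # -2.94
```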
def test_manning_friction_old(self):
from anuga.config import g
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Use the old function which doesn't take into account the extra
# wetted area due to slope of bed
domain.set_sloped_mannings_function(False)
B = Reflective_boundary(domain)
domain.set_boundary( {'exterior': B})
#Set up for a gradient of (3,0) at mid triangle (bce)
def slope(x, y):
return 3*x
h = 0.1
def stage(x, y):
return slope(x, y) + h
eta = 0.07
domain.set_quantity('elevation', slope)
domain.set_quantity('stage', stage)
domain.set_quantity('friction', eta)
for name in domain.conserved_quantities:
assert num.allclose(domain.quantities[name].explicit_update, 0)
assert num.allclose(domain.quantities[name].semi_implicit_update, 0)
# Only manning friction in the forcing terms (gravity now combined with flux calc)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].explicit_update,
0)
assert num.allclose(domain.quantities['ymomentum'].explicit_update, 0)
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,
0)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,
0)
#Create some momentum for friction to work with
domain.set_quantity('xmomentum', 1)
S = old_div(-g*eta**2, h**(7.0/3))
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,
S)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,
0)
#A more complex example
domain.quantities['stage'].semi_implicit_update[:] = 0.0
domain.quantities['xmomentum'].semi_implicit_update[:] = 0.0
domain.quantities['ymomentum'].semi_implicit_update[:] = 0.0
domain.set_quantity('xmomentum', 3)
domain.set_quantity('ymomentum', 4)
# sqrt(3^2 +4^2) = 5
S = old_div(-g*eta**2, h**(7.0/3)) * 5
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,3*S)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,4*S)
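# NOTE (standalone sketch of the flat-bed Manning term verified above, g
# assumed to be 9.8): the semi-implicit update on each momentum component is
# S * q_i with S = -g * eta^2 * |q| / h^(7/3), so the 3:4 momentum ratio of
# the test is preserved by the friction.

```python
import math

g, eta, h = 9.8, 0.07, 0.1      # friction coefficient and depth from the test
qx, qy = 3.0, 4.0               # momentum components; |q| = 5
S = -g * eta**2 * math.hypot(qx, qy) / h**(7.0 / 3.0)
update_x, update_y = S * qx, S * qy
print(update_y / update_x)      # approximately 4/3: direction is preserved
```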
def test_manning_friction_new(self):
from anuga.config import g
import math
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
B = Reflective_boundary(domain)
domain.set_boundary( {'exterior': B})
# Use the new function which takes into account the extra
# wetted area due to slope of bed
domain.set_sloped_mannings_function(True)
#Set up for a gradient of (3,0) at mid triangle (bce)
def slope(x, y):
return 3*x
h = 0.1
def stage(x, y):
return slope(x, y) + h
eta = 0.07
domain.set_quantity('elevation', slope)
domain.set_quantity('stage', stage)
domain.set_quantity('friction', eta)
for name in domain.conserved_quantities:
assert num.allclose(domain.quantities[name].explicit_update, 0)
assert num.allclose(domain.quantities[name].semi_implicit_update, 0)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].explicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].explicit_update,
0)
assert num.allclose(domain.quantities['ymomentum'].explicit_update, 0)
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,
0)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,
0)
#Create some momentum for friction to work with
domain.set_quantity('xmomentum', 1)
S = old_div(-g*eta**2, h**(7.0/3)) * math.sqrt(10)
domain.compute_forcing_terms()
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,
S)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,
0)
#A more complex example
domain.quantities['stage'].semi_implicit_update[:] = 0.0
domain.quantities['xmomentum'].semi_implicit_update[:] = 0.0
domain.quantities['ymomentum'].semi_implicit_update[:] = 0.0
domain.set_quantity('xmomentum', 3)
domain.set_quantity('ymomentum', 4)
S = old_div(-g*eta**2 *5, h**(7.0/3)) * math.sqrt(10.0)
domain.compute_forcing_terms()
#print 'S', S
#print domain.quantities['xmomentum'].semi_implicit_update
#print domain.quantities['ymomentum'].semi_implicit_update
assert num.allclose(domain.quantities['stage'].semi_implicit_update, 0)
assert num.allclose(domain.quantities['xmomentum'].semi_implicit_update,3*S)
assert num.allclose(domain.quantities['ymomentum'].semi_implicit_update,4*S)
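# NOTE (illustrative sketch): the 'sloped' Manning variant multiplies the
# flat-bed factor by the wetted-area correction
# sqrt(1 + (dz/dx)^2 + (dz/dy)^2); for the bed z = 3*x used above that is
# sqrt(1 + 9) = sqrt(10), matching the math.sqrt(10) factor in S.

```python
import math

dzdx, dzdy = 3.0, 0.0   # bed gradients from slope(x, y) = 3*x
area_factor = math.sqrt(1.0 + dzdx**2 + dzdy**2)
print(area_factor)      # sqrt(10), roughly 3.1623
```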
def test_inflow_using_circle(self):
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, constant inflow of 2 m^3/s
# on a circle affecting triangles #0 and #1 (bac and bce)
domain.forcing_terms = []
I = Inflow(domain, rate=2.0, center=(1,1), radius=1)
domain.forcing_terms.append(I)
domain.compute_forcing_terms()
A = I.exchange_area
assert num.allclose(A, 4) # Two triangles
assert num.allclose(domain.quantities['stage'].explicit_update[1], 2.0/A)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 2.0/A)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
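# NOTE (minimal sketch of the bookkeeping checked above): Inflow spreads its
# rate uniformly over the exchange area, so the stage update per triangle is
# rate / A; here the radius-1 circle at (1,1) covers the two right triangles
# bac and bce (area 2 each, assumed from the mesh coordinates), giving A = 4.

```python
rate = 2.0                  # constant inflow in m^3/s
A = 2.0 + 2.0               # total exchange area of the two covered triangles
stage_update = rate / A
print(stage_update)  # 0.5
```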
def test_inflow_using_circle_function(self):
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, time dependent inflow of 2 m^3/s
# on a circle affecting triangles #0 and #1 (bac and bce)
domain.forcing_terms = []
I = Inflow(domain, rate=lambda t: 2., center=(1,1), radius=1)
domain.forcing_terms.append(I)
domain.compute_forcing_terms()
A = I.exchange_area
assert num.allclose(A, 4) # Two triangles
assert num.allclose(domain.quantities['stage'].explicit_update[1], 2.0/A)
assert num.allclose(domain.quantities['stage'].explicit_update[0], 2.0/A)
assert num.allclose(domain.quantities['stage'].explicit_update[2:], 0)
def test_inflow_catch_too_few_triangles(self):
"""
Test that exception is thrown if no triangles are covered
by the inflow area
"""
from math import pi, cos, sin
a = [0.0, 0.0]
b = [0.0, 2.0]
c = [2.0, 0.0]
d = [0.0, 4.0]
e = [2.0, 2.0]
f = [4.0, 0.0]
points = [a, b, c, d, e, f]
# bac, bce, ecf, dbe
vertices = [[1,0,2], [1,2,4], [4,2,5], [3,1,4]]
domain = Domain(points, vertices)
# Flat surface with 1m of water
domain.set_quantity('elevation', 0)
domain.set_quantity('stage', 1.0)
domain.set_quantity('friction', 0)
Br = Reflective_boundary(domain)
domain.set_boundary({'exterior': Br})
# Setup only one forcing term, constant inflow of 2 m^3/s
# on a circle affecting triangles #0 and #1 (bac and bce)
try:
Inflow(domain, rate=2.0, center=(1,1.1), radius=0.01)
except:
pass
else:
msg = 'Should have raised exception'
raise_(Exception, msg)
if __name__ == "__main__":
suite = unittest.makeSuite(Test_Forcing, 'test')
runner = unittest.TextTestRunner(verbosity=1)
runner.run(suite)
#!/usr/bin/python
# inkscape/.config/inkscape/extensions/circuitSymbols/drawSources.py
# (from the Elyk8/dotrice repository, BSD-3-Clause)
import inkscapeMadeEasy.inkscapeMadeEasy_Base as inkBase
import inkscapeMadeEasy.inkscapeMadeEasy_Draw as inkDraw
import numpy as np
class source(inkBase.inkscapeMadeEasy):
def add(self, vector, delta):
# vector does not need to be a numpy array; delta will be converted to one, and numpy can then handle np.array + list
return vector + np.array(delta)
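# NOTE (hypothetical standalone illustration): what source.add computes,
# sketched without numpy — an element-wise sum of a position and an offset
# (numpy broadcasting handles array + list natively in the method above).

```python
def add(vector, delta):
    # element-wise sum, equivalent to vector + np.array(delta) for 2-vectors
    return [v + d for v, d in zip(vector, delta)]

print(add([10, 20], [-1, 0]))  # [9, 20]
```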
def drawSigns(self, group, positionPos=None, positionNeg=None):
# draw + and - signs
# positive Sign
lineStyleSign = inkDraw.lineStyle.setSimpleBlack(lineWidth=0.6)
if positionPos is not None:
inkDraw.line.relCoords(group, [[2, 0]], self.add(positionPos, [-1, 0]), lineStyle=lineStyleSign)
inkDraw.line.relCoords(group, [[0, 2]], self.add(positionPos, [0, -1]), lineStyle=lineStyleSign)
# negative Sign
if positionNeg is not None:
inkDraw.line.relCoords(group, [[0, 2]], self.add(positionNeg, [0, -1]), lineStyle=lineStyleSign)
# ---------------------------------------------
def drawSourceV(self, parent, position=[0, 0], value='v(t)', sourceType='general', label='Source', angleDeg=0, flagVolt=True, flagCurr=True, currName='i',
invertArrows=False, mirror=False, convention='active', wireExtraSize=0,standard='IEEE',flagVariable=False):
""" draws a independend voltage source
parent: parent object
position: position [x,y]
value: string with value.
sourceType: type of source. Values: 'general' (default), 'sinusoidal'
label: label of the object (it can be repeated)
angleDeg: rotation angle in degrees counter-clockwise (default 0)
flagVolt: indicates whether the voltage arrow must be drawn (default: true)
flagCurr: indicates whether the current arrow must be drawn (default: true)
currName: current drop name (default: i)
mirror: mirror source drawing (default: False)
convention: passive/active sign convention. available types: 'passive', 'active' (default)
wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
standard: types: 'IEEE' (american), 'IEC' (european)
"""
group = self.createGroup(parent, label)
elem = self.createGroup(group, label)
lineStyleSign = inkDraw.lineStyle.setSimpleBlack(lineWidth=0.6)
inkDraw.line.relCoords(elem, [[-(18 + wireExtraSize), 0]], self.add(position, [-7, 0]))
inkDraw.line.relCoords(elem, [[18 + wireExtraSize, 0]], self.add(position, [7, 0]))
inkDraw.circle.centerRadius(elem, [0, 0], 7.0, offset=position, label='circle')
if sourceType == 'general':
if standard == 'IEEE':
# signs
if mirror:
self.drawSigns(elem, positionPos=self.add(position, [-4, 0]), positionNeg=self.add(position, [4, 0]))
else:
self.drawSigns(elem, positionPos=self.add(position, [4, 0]), positionNeg=self.add(position, [-4, 0]))
if standard == 'IEC':
inkDraw.line.relCoords(elem, [[14, 0]], self.add(position, [-7, 0]),lineStyle=lineStyleSign)
if sourceType == 'sinusoidal':
sine = self.createGroup(elem)
inkDraw.arc.startEndRadius(sine, self.add(position, [-5, 0]), position, 2.6, [0, 0], lineStyle=lineStyleSign, flagRightOf=True,
largeArc=False)
inkDraw.arc.startEndRadius(sine, self.add(position, [5, 0]), position, 2.6, [0, 0], lineStyle=lineStyleSign, flagRightOf=True,
largeArc=False)
self.rotateElement(sine, position, -angleDeg)
if mirror:
self.drawSigns(elem, positionPos=self.add(position, [-10, -4]), positionNeg=self.add(position, [10, -4]))
else:
self.drawSigns(elem, positionPos=self.add(position, [10, -4]), positionNeg=self.add(position, [-10, -4]))
if flagVariable:
# build arrow marker
colorBlack = inkDraw.color.defined('black')
L_arrow = 2.5
markerPath = 'M 0,0 l -%f,%f l 0,-%f z' % (L_arrow * 1.2, L_arrow / 2.0, L_arrow)
markerArrow = inkDraw.marker.createMarker(self, 'arrow', markerPath, RenameMode=0, strokeColor=colorBlack, fillColor=colorBlack,
lineWidth=0.6, markerTransform='translate (1,0)')
lineStyleArrow = inkDraw.lineStyle.set(lineWidth=1, lineColor=colorBlack, markerEnd=markerArrow)
inkDraw.line.relCoords(elem, [[17, -15]], self.add(position, [-8, 7]), lineStyle=lineStyleArrow)
pos_text = self.add(position, [0, -8 - self.textOffset])
if inkDraw.useLatex:
value = '$' + value + '$'
inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)
if angleDeg != 0:
self.rotateElement(group, position, angleDeg)
inv_volt = (invertArrows == mirror)
if flagVolt:
if invertArrows:
if inkDraw.useLatex:
if value[1] == '-':
value = value[0] + value[2:]
else:
value = value[0] + '-' + value[1:]
else:
if value[0] == '-':
value = value[1:]
else:
value = '-' + value
self.drawVoltArrow(group, self.add(position, [0, 8]), name=value, color=self.voltageColor, angleDeg=angleDeg, invertArrows=inv_volt)
if flagCurr:
if convention == 'active':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=inv_volt)
if convention == 'passive':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=not inv_volt)
return group
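# NOTE (hypothetical standalone helper, extracted for clarity): the
# value-string negation performed above when invertArrows is set. With LaTeX
# enabled the label starts with '$', so the sign is toggled after that first
# character; otherwise it is toggled at the front of the string.

```python
def negate_label(value, uses_latex):
    if uses_latex:           # value looks like '$...'
        if value[1] == '-':
            return value[0] + value[2:]
        return value[0] + '-' + value[1:]
    if value[0] == '-':
        return value[1:]
    return '-' + value

print(negate_label('$v(t)$', True))   # $-v(t)$
print(negate_label('-v', False))      # v
```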
# ---------------------------------------------
def drawSourceVDC(self, parent, position=[0, 0], value='V', label='Source', angleDeg=0, flagVolt=True, flagCurr=True, currName='i',
invertArrows=False, mirror=False, convention='active', wireExtraSize=0,flagVariable=False):
""" draws a DC voltage source
parent: parent object
position: position [x,y]
value: string with value.
label: label of the object (it can be repeated)
angleDeg: rotation angle in degrees counter-clockwise (default 0)
flagVolt: indicates whether the voltage arrow must be drawn (default: true)
flagCurr: indicates whether the current arrow must be drawn (default: true)
currName: current drop name (default: i)
mirror: mirror source drawing (default: False)
convention: passive/active sign convention. available types: 'passive', 'active' (default)
wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
"""
group = self.createGroup(parent, label)
elem = self.createGroup(group, label)
inkDraw.line.relCoords(elem, [[-(24 + wireExtraSize), 0]], self.add(position, [-1, 0]))
inkDraw.line.relCoords(elem, [[23 + wireExtraSize, 0]], self.add(position, [2, 0]))
# draw source
if mirror:
self.drawSigns(elem, positionPos=self.add(position, [-4, -6]), positionNeg=self.add(position, [5, -6]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [2, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [-1, 7]))
else:
self.drawSigns(elem, positionPos=self.add(position, [5, -6]), positionNeg=self.add(position, [-4, -6]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [-1, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [2, 7]))
if flagVariable:
# build arrow marker
colorBlack = inkDraw.color.defined('black')
L_arrow = 2.5
markerPath = 'M 0,0 l -%f,%f l 0,-%f z' % (L_arrow * 1.2, L_arrow / 2.0, L_arrow)
markerArrow = inkDraw.marker.createMarker(self, 'arrow', markerPath, RenameMode=0, strokeColor=colorBlack, fillColor=colorBlack,
lineWidth=0.6, markerTransform='translate (1,0)')
lineStyleArrow = inkDraw.lineStyle.set(lineWidth=1, lineColor=colorBlack, markerEnd=markerArrow)
inkDraw.line.relCoords(elem, [[16, -10]], self.add(position, [-8, 5]), lineStyle=lineStyleArrow)
pos_text = self.add(position, [0, -8 - self.textOffset])
if inkDraw.useLatex:
value = '$' + value + '$'
inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)
if angleDeg != 0:
self.rotateElement(group, position, angleDeg)
inv_volt = (invertArrows == mirror)
if flagVolt:
if invertArrows:
if inkDraw.useLatex:
if value[1] == '-':
value = value[0] + value[2:]
else:
value = value[0] + '-' + value[1:]
else:
if value[0] == '-':
value = value[1:]
else:
value = '-' + value
self.drawVoltArrow(group, self.add(position, [0, 8]), name=value, color=self.voltageColor, angleDeg=angleDeg, invertArrows=inv_volt)
if flagCurr:
if convention == 'active':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=inv_volt)
if convention == 'passive':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=not inv_volt)
return group
# ---------------------------------------------
def drawSourceVDCbattery(self, parent, position=[0, 0], value='V', label='Source', angleDeg=0, flagVolt=True, flagCurr=True, currName='i',
invertArrows=False, mirror=False, convention='active', wireExtraSize=0,flagVariable=False):
""" draws a DC battery source
parent: parent object
position: position [x,y]
value: string with value.
label: label of the object (it can be repeated)
angleDeg: rotation angle in degrees counter-clockwise (default 0)
flagVolt: indicates whether the voltage arrow must be drawn (default: true)
flagCurr: indicates whether the current arrow must be drawn (default: true)
currName: current drop name (default: i)
mirror: mirror source drawing (default: False)
convention: passive/active sign convention. available types: 'passive', 'active' (default)
wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
"""
group = self.createGroup(parent, label)
elem = self.createGroup(group, label)
inkDraw.line.relCoords(elem, [[-(18 + wireExtraSize), 0]], self.add(position, [-7, 0]))
inkDraw.line.relCoords(elem, [[17 + wireExtraSize, 0]], self.add(position, [8, 0]))
# draw source
if mirror:
self.drawSigns(elem, positionPos=self.add(position, [-10, -4]), positionNeg=self.add(position, [11, -4]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [-4, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [-7, 7]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [2, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [-1, 7]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [8, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [5, 7]))
else:
self.drawSigns(elem, positionPos=self.add(position, [11, -4]), positionNeg=self.add(position, [-10, -4]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [-4, 7]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [-7, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [2, 7]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [-1, 3]))
inkDraw.line.relCoords(elem, [[0, -14]], self.add(position, [8, 7]))
inkDraw.line.relCoords(elem, [[0, -6]], self.add(position, [5, 3]))
if flagVariable:
# build arrow marker
colorBlack = inkDraw.color.defined('black')
L_arrow = 2.5
markerPath = 'M 0,0 l -%f,%f l 0,-%f z' % (L_arrow * 1.2, L_arrow / 2.0, L_arrow)
markerArrow = inkDraw.marker.createMarker(self, 'arrow', markerPath, RenameMode=0, strokeColor=colorBlack, fillColor=colorBlack,
lineWidth=0.6, markerTransform='translate (1,0)')
lineStyleArrow = inkDraw.lineStyle.set(lineWidth=1, lineColor=colorBlack, markerEnd=markerArrow)
inkDraw.line.relCoords(elem, [[19, -18]], self.add(position, [-9, 8]), lineStyle=lineStyleArrow)
pos_text = self.add(position, [0, -9 - self.textOffset])
if inkDraw.useLatex:
value = '$' + value + '$'
inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)
if angleDeg != 0:
self.rotateElement(group, position, angleDeg)
inv_volt = (invertArrows == mirror)
if flagVolt:
if invertArrows:
if inkDraw.useLatex:
if value[1] == '-':
value = value[0] + value[2:]
else:
value = value[0] + '-' + value[1:]
else:
if value[0] == '-':
value = value[1:]
else:
value = '-' + value
self.drawVoltArrow(group, self.add(position, [0, 9]), name=value, color=self.voltageColor, angleDeg=angleDeg, invertArrows=inv_volt)
if flagCurr:
if convention == 'active':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=inv_volt)
if convention == 'passive':
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
invertArrows=not inv_volt)
return group
# ---------------------------------------------
def drawSourceI(self, parent, position=[0, 0], value='i(t)', label='Source', angleDeg=0, flagVolt=True, flagCurr=True, voltName='v',
invertArrows=False, mirror=False, convention='active', wireExtraSize=0,standard='IEEE',flagVariable=False):
""" draws a independend general current source
parent: parent object
position: position [x,y]
value: string with value.
label: label of the object (it can be repeated)
angleDeg: rotation angle in degrees counter-clockwise (default 0)
flagVolt: indicates whether the voltage arrow must be drawn (default: true)
voltName: voltage drop name (default: v)
flagCurr: indicates whether the current arrow must be drawn (default: true)
mirror: mirror source drawing (default: False)
convention: passive/active sign convention. available types: 'passive', 'active' (default)
wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
standard: types: 'IEEE' (american), 'IEC' (european), 'OLD' (two circles - DIN)
"""
group = self.createGroup(parent, label)
elem = self.createGroup(group, label)
# terminals and circle(s)
if standard.upper() == 'OLD':
inkDraw.line.relCoords(elem, [[-(14 + wireExtraSize), 0]], self.add(position, [-11, 0]))
inkDraw.line.relCoords(elem, [[14 + wireExtraSize, 0]], self.add(position, [11, 0]))
inkDraw.circle.centerRadius(elem, [4, 0], 7.0, offset=position, label='circle')
inkDraw.circle.centerRadius(elem, [-4, 0], 7.0, offset=position, label='circle')
else:
inkDraw.line.relCoords(elem, [[-(18 + wireExtraSize), 0]], self.add(position, [-7, 0]))
inkDraw.line.relCoords(elem, [[18 + wireExtraSize, 0]], self.add(position, [7, 0]))
inkDraw.circle.centerRadius(elem, [0, 0], 7.0, offset=position, label='circle')
# arrow
if standard.upper() == 'IEEE':
lineStyleSign = inkDraw.lineStyle.set(lineWidth=0.7, lineColor=inkDraw.color.defined('black'), fillColor=inkDraw.color.defined('black'))
if mirror:
inkDraw.line.relCoords(elem, [[-5, 0], [0, 1.2], [-3, -1.2], [3, -1.2], [0, 1.2]], self.add(position, [4, 0]), lineStyle=lineStyleSign)
else:
inkDraw.line.relCoords(elem, [[5, 0], [0, 1.2], [3, -1.2], [-3, -1.2], [0, 1.2]], self.add(position, [-4, 0]), lineStyle=lineStyleSign)
if standard.upper() == 'IEC':
lineStyleSign = inkDraw.lineStyle.setSimpleBlack(lineWidth=0.6)
inkDraw.line.relCoords(elem, [[0,14]], self.add(position, [0,-7]),lineStyle=lineStyleSign)
#if standard.upper() == 'OLD':
# lineStyleSign = inkDraw.lineStyle.set(lineWidth=0.7, lineColor=inkDraw.color.defined('black'), fillColor=inkDraw.color.defined('black'))
# if mirror:
# inkDraw.line.relCoords(elem, [[-13, 0], [0, 1.2], [-3, -1.2], [3, -1.2], [0, 1.2]], self.add(position, [8, 0]), lineStyle=lineStyleSign)
# else:
# inkDraw.line.relCoords(elem, [[13, 0], [0, 1.2], [3, -1.2], [-3, -1.2], [0, 1.2]], self.add(position, [-8, 0]), lineStyle=lineStyleSign)
if flagVariable:
# build arrow marker
colorBlack = inkDraw.color.defined('black')
L_arrow = 2.5
markerPath = 'M 0,0 l -%f,%f l 0,-%f z' % (L_arrow * 1.2, L_arrow / 2.0, L_arrow)
markerArrow = inkDraw.marker.createMarker(self, 'arrow', markerPath, RenameMode=0, strokeColor=colorBlack, fillColor=colorBlack,
lineWidth=0.6, markerTransform='translate (1,0)')
lineStyleArrow = inkDraw.lineStyle.set(lineWidth=1, lineColor=colorBlack, markerEnd=markerArrow)
if standard.upper() == 'OLD':
inkDraw.line.relCoords(elem, [[20, -15]], self.add(position, [-10, 7]), lineStyle=lineStyleArrow)
else:
inkDraw.line.relCoords(elem, [[17, -15]], self.add(position, [-8, 7]), lineStyle=lineStyleArrow)
pos_text = self.add(position, [0, -8 - self.textOffset])
if inkDraw.useLatex:
value = '$' + value + '$'
inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)
if angleDeg != 0:
self.rotateElement(group, position, angleDeg)
inv_curr = (invertArrows == mirror)
if flagVolt:
if convention == 'active':
self.drawVoltArrow(group, self.add(position, [0, 9]), name=voltName, color=self.voltageColor, angleDeg=angleDeg,
invertArrows=inv_curr)
if convention == 'passive':
self.drawVoltArrow(group, self.add(position, [0, 9]), name=voltName, color=self.voltageColor, angleDeg=angleDeg,
invertArrows=not inv_curr)
if flagCurr:
if invertArrows:
if inkDraw.useLatex:
if value[1] == '-':
value = value[0] + value[2:]
else:
value = value[0] + '-' + value[1:]
else:
if value[0] == '-':
value = value[1:]
else:
value = '-' + value
self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=value, color=self.currentColor, angleDeg=angleDeg,
invertArrows=inv_curr)
return group
    # ---------------------------------------------
    def drawControledSourceV(self, parent, position=[0, 0], controlType='volt', gain='k', controlName='v_c', label='Source', angleDeg=0,
                             flagVolt=True, flagCurr=True, currName='i', invertArrows=False, mirror=False, convention='active', drawControl=False,
                             wireExtraSize=0, standard='IEEE'):
        """ draws a controlled general voltage source

        parent: parent object
        position: position [x,y]
        controlType: 'volt' or 'curr'
        gain: controlled source gain value
        controlName: name of the controlling signal
        label: label of the object (it can be repeated)
        angleDeg: rotation angle in degrees counter-clockwise (default: 0)
        flagVolt: indicates whether the voltage arrow must be drawn (default: True)
        currName: current name (default: i)
        flagCurr: indicates whether the current arrow must be drawn (default: True)
        mirror: mirror source drawing (default: False)
        convention: passive/active sign convention. available types: 'passive', 'active' (default)
        drawControl: draws control annotation arrow (default: False)
        wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
        standard: types: 'IEEE' (american), 'IEC' (european)
        """
        group = self.createGroup(parent, label)
        elem = self.createGroup(group, label)

        lineStyleSign = inkDraw.lineStyle.setSimpleBlack(lineWidth=0.6)

        inkDraw.line.relCoords(elem, [[-(17 + wireExtraSize), 0]], self.add(position, [-8, 0]))
        inkDraw.line.relCoords(elem, [[17 + wireExtraSize, 0]], self.add(position, [8, 0]))
        inkDraw.line.relCoords(elem, [[8, 8], [8, -8], [-8, -8], [-8, 8]], self.add(position, [-8, 0]))

        if standard == 'IEEE':
            # signs
            if mirror:
                self.drawSigns(elem, positionPos=self.add(position, [-4, 0]), positionNeg=self.add(position, [4, 0]))
            else:
                self.drawSigns(elem, positionPos=self.add(position, [4, 0]), positionNeg=self.add(position, [-4, 0]))

        if standard == 'IEC':
            inkDraw.line.relCoords(elem, [[14, 0]], self.add(position, [-7, 0]), lineStyle=lineStyleSign)

        # text
        pos_text = self.add(position, [0, -8 - self.textOffset])
        value = gain + '.' + controlName
        if inkDraw.useLatex:
            value = '$' + value + '$'

        inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)

        if angleDeg != 0:
            self.rotateElement(group, position, angleDeg)

        # arrows
        inv_volt = (invertArrows == mirror)
        if flagVolt:
            if invertArrows:
                if inkDraw.useLatex:
                    if value[1] == '-':
                        value = value[0] + value[2:]
                    else:
                        value = value[0] + '-' + value[1:]
                else:
                    if value[0] == '-':
                        value = value[1:]
                    else:
                        value = '-' + value
            self.drawVoltArrow(group, self.add(position, [0, 8]), name=value, color=self.voltageColor, angleDeg=angleDeg, invertArrows=inv_volt)

        if flagCurr:
            if convention == 'active':
                self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
                                   invertArrows=inv_volt)
            if convention == 'passive':
                self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=currName, color=self.currentColor, angleDeg=angleDeg,
                                   invertArrows=not inv_volt)

        # control signal
        if drawControl:
            for theta in range(0, 360, 90):
                pos1 = self.add(position, [-20, 25 + theta / 4])
                pos2 = self.add(position, [0, 25 + theta / 4])
                if controlType == 'volt':
                    temp1 = self.drawVoltArrow(parent, pos1, name=controlName, color=self.voltageColor, angleDeg=theta, invertArrows=False)
                    temp2 = self.drawVoltArrow(parent, pos2, name=controlName, color=self.voltageColor, angleDeg=theta, invertArrows=True)
                if controlType == 'curr':
                    temp1 = self.drawCurrArrow(parent, pos1, name=controlName, color=self.currentColor, angleDeg=theta, invertArrows=False)
                    temp2 = self.drawCurrArrow(parent, pos2, name=controlName, color=self.currentColor, angleDeg=theta, invertArrows=True)
                self.rotateElement(temp1, pos1, theta)
                self.rotateElement(temp2, pos2, theta)

        return group
    # ---------------------------------------------
    def drawControledSourceI(self, parent, position=[0, 0], controlType='volt', gain='k', controlName='v_c', label='Source', angleDeg=0,
                             flagVolt=True, flagCurr=True, voltName='v', invertArrows=False, mirror=False, convention='active', drawControl=False,
                             wireExtraSize=0, standard='IEEE'):
        """ draws a controlled general current source

        parent: parent object
        position: position [x,y]
        controlType: 'volt' or 'curr'
        gain: controlled source gain value
        controlName: name of the controlling signal
        label: label of the object (it can be repeated)
        angleDeg: rotation angle in degrees counter-clockwise (default: 0)
        flagVolt: indicates whether the voltage arrow must be drawn (default: True)
        voltName: voltage name (default: v)
        flagCurr: indicates whether the current arrow must be drawn (default: True)
        mirror: mirror source drawing (default: False)
        convention: passive/active sign convention. available types: 'passive', 'active' (default)
        wireExtraSize: additional length added to the terminals. If negative, the length will be reduced (default: 0)
        drawControl: draws control annotation arrow (default: False)
        standard: types: 'IEEE' (american), 'IEC' (european)
        """
        group = self.createGroup(parent, label)
        elem = self.createGroup(group, label)

        inkDraw.line.relCoords(elem, [[-(17 + wireExtraSize), 0]], self.add(position, [-8, 0]))
        inkDraw.line.relCoords(elem, [[17 + wireExtraSize, 0]], self.add(position, [8, 0]))
        inkDraw.line.relCoords(elem, [[8, 8], [8, -8], [-8, -8], [-8, 8]], self.add(position, [-8, 0]))

        if standard == 'IEEE':
            # arrow
            lineStyleSign = inkDraw.lineStyle.set(lineWidth=0.7, lineColor=inkDraw.color.defined('black'), fillColor=inkDraw.color.defined('black'))
            if mirror:
                inkDraw.line.relCoords(elem, [[-5, 0], [0, 1.2], [-3, -1.2], [3, -1.2], [0, 1.2]], self.add(position, [4, 0]), lineStyle=lineStyleSign)
            else:
                inkDraw.line.relCoords(elem, [[5, 0], [0, 1.2], [3, -1.2], [-3, -1.2], [0, 1.2]], self.add(position, [-4, 0]), lineStyle=lineStyleSign)

        if standard == 'IEC':
            lineStyleSign = inkDraw.lineStyle.setSimpleBlack(lineWidth=0.6)
            inkDraw.line.relCoords(elem, [[0, 14]], self.add(position, [0, -7]), lineStyle=lineStyleSign)

        # text
        pos_text = self.add(position, [0, -8 - self.textOffset])
        value = gain + '.' + controlName
        if inkDraw.useLatex:
            value = '$' + value + '$'

        inkDraw.text.latex(self, group, value, pos_text, fontSize=self.fontSize, refPoint='bc', preambleFile=self.preambleFile)

        if angleDeg != 0:
            self.rotateElement(group, position, angleDeg)

        # arrows
        inv_curr = (invertArrows == mirror)
        if flagVolt:
            if convention == 'active':
                self.drawVoltArrow(group, self.add(position, [0, 8]), name=voltName, color=self.voltageColor, angleDeg=angleDeg,
                                   invertArrows=inv_curr)
            if convention == 'passive':
                self.drawVoltArrow(group, self.add(position, [0, 8]), name=voltName, color=self.voltageColor, angleDeg=angleDeg,
                                   invertArrows=not inv_curr)

        if flagCurr:
            if invertArrows:
                if inkDraw.useLatex:
                    if value[1] == '-':
                        value = value[0] + value[2:]
                    else:
                        value = value[0] + '-' + value[1:]
                else:
                    if value[0] == '-':
                        value = value[1:]
                    else:
                        value = '-' + value
            self.drawCurrArrow(group, self.add(position, [20 + wireExtraSize, -5]), name=value, color=self.currentColor, angleDeg=angleDeg,
                               invertArrows=inv_curr)

        # control signal
        if drawControl:
            for theta in range(0, 360, 90):
                pos1 = self.add(position, [-20, 25 + theta / 4])
                pos2 = self.add(position, [0, 25 + theta / 4])
                if controlType == 'volt':
                    temp1 = self.drawVoltArrow(parent, pos1, name=controlName, color=self.voltageColor, angleDeg=theta, invertArrows=False)
                    temp2 = self.drawVoltArrow(parent, pos2, name=controlName, color=self.voltageColor, angleDeg=theta, invertArrows=True)
                if controlType == 'curr':
                    temp1 = self.drawCurrArrow(parent, pos1, name=controlName, color=self.currentColor, angleDeg=theta, invertArrows=False)
                    temp2 = self.drawCurrArrow(parent, pos2, name=controlName, color=self.currentColor, angleDeg=theta, invertArrows=True)
                self.rotateElement(temp1, pos1, theta)
                self.rotateElement(temp2, pos2, theta)

        return group
# --- tests/test_metrics.py (mmsegmentation, Apache-2.0) ---

import numpy as np
from mmseg.core.evaluation import (eval_metrics, mean_dice, mean_fscore,
                                   mean_iou)
from mmseg.core.evaluation.metrics import f_score


def get_confusion_matrix(pred_label, label, num_classes, ignore_index):
    """Compute the confusion matrix between a prediction and a label map.

    Args:
        pred_label (np.ndarray): 2D predicted map
        label (np.ndarray): 2D ground-truth label map
        num_classes (int): number of categories
        ignore_index (int): index ignored in evaluation
    """
    mask = (label != ignore_index)
    pred_label = pred_label[mask]
    label = label[mask]

    n = num_classes
    inds = n * label + pred_label
    mat = np.bincount(inds, minlength=n**2).reshape(n, n)

    return mat
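# The bincount trick above works because each (ground-truth, prediction)
# pair maps to a unique flat index n * gt + pred; histogramming those
# indices and reshaping to (n, n) yields the confusion matrix. A
# dependency-free sketch (hypothetical helper, illustration only):

```python
def confusion_matrix_py(pred_label, label, num_classes, ignore_index):
    # counts[n * gt + pred] counts pixels with ground truth gt, prediction pred
    n = num_classes
    counts = [0] * (n * n)
    for pred, gt in zip(pred_label, label):
        if gt == ignore_index:
            continue  # ignored pixels contribute nothing
        counts[n * gt + pred] += 1
    # reshape the flat histogram into n rows (rows: ground truth, cols: prediction)
    return [counts[r * n:(r + 1) * n] for r in range(n)]

mat = confusion_matrix_py([0, 1, 1, 0], [0, 1, 0, 255], num_classes=2, ignore_index=255)
```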
# This func is deprecated since it's not memory efficient
def legacy_mean_iou(results, gt_seg_maps, num_classes, ignore_index):
    num_imgs = len(results)
    assert len(gt_seg_maps) == num_imgs
    total_mat = np.zeros((num_classes, num_classes), dtype=float)
    for i in range(num_imgs):
        mat = get_confusion_matrix(
            results[i], gt_seg_maps[i], num_classes, ignore_index=ignore_index)
        total_mat += mat
    all_acc = np.diag(total_mat).sum() / total_mat.sum()
    acc = np.diag(total_mat) / total_mat.sum(axis=1)
    iou = np.diag(total_mat) / (
        total_mat.sum(axis=1) + total_mat.sum(axis=0) - np.diag(total_mat))

    return all_acc, acc, iou
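# For a confusion matrix, per-class IoU is diag / (row_sum + col_sum - diag):
# true positives divided by the union of ground-truth and predicted pixels.
# Tiny worked example (plain Python, illustration only):

```python
mat = [[3, 1],
       [2, 4]]  # rows: ground truth, cols: prediction

def iou_per_class(mat):
    n = len(mat)
    ious = []
    for c in range(n):
        tp = mat[c][c]
        gt = sum(mat[c])                         # row sum: ground-truth pixels of class c
        pred = sum(mat[r][c] for r in range(n))  # col sum: pixels predicted as class c
        ious.append(tp / (gt + pred - tp))       # union = gt + pred - tp
    return ious

ious = iou_per_class(mat)  # class 0: 3/6, class 1: 4/7
```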
# This func is deprecated since it's not memory efficient
def legacy_mean_dice(results, gt_seg_maps, num_classes, ignore_index):
    num_imgs = len(results)
    assert len(gt_seg_maps) == num_imgs
    total_mat = np.zeros((num_classes, num_classes), dtype=float)
    for i in range(num_imgs):
        mat = get_confusion_matrix(
            results[i], gt_seg_maps[i], num_classes, ignore_index=ignore_index)
        total_mat += mat
    all_acc = np.diag(total_mat).sum() / total_mat.sum()
    acc = np.diag(total_mat) / total_mat.sum(axis=1)
    dice = 2 * np.diag(total_mat) / (
        total_mat.sum(axis=1) + total_mat.sum(axis=0))

    return all_acc, acc, dice
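# Dice differs from IoU only in the denominator: 2*tp / (row_sum + col_sum).
# The two are related by dice = 2*iou / (1 + iou). Quick numeric check for
# one class of the matrix used above (illustration only):

```python
tp, gt_total, pred_total = 3, 4, 5
iou = tp / (gt_total + pred_total - tp)        # 3 / 6
dice = 2 * tp / (gt_total + pred_total)        # 6 / 9
assert abs(dice - 2 * iou / (1 + iou)) < 1e-9  # identity relating the two metrics
```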
# This func is deprecated since it's not memory efficient
def legacy_mean_fscore(results,
                       gt_seg_maps,
                       num_classes,
                       ignore_index,
                       beta=1):
    num_imgs = len(results)
    assert len(gt_seg_maps) == num_imgs
    total_mat = np.zeros((num_classes, num_classes), dtype=float)
    for i in range(num_imgs):
        mat = get_confusion_matrix(
            results[i], gt_seg_maps[i], num_classes, ignore_index=ignore_index)
        total_mat += mat
    all_acc = np.diag(total_mat).sum() / total_mat.sum()
    recall = np.diag(total_mat) / total_mat.sum(axis=1)
    precision = np.diag(total_mat) / total_mat.sum(axis=0)
    fv = np.vectorize(f_score)
    fscore = fv(precision, recall, beta=beta)

    return all_acc, recall, precision, fscore
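# f_score is assumed here to follow the standard F-beta definition,
# (1 + beta^2) * P * R / (beta^2 * P + R); beta=1 gives the harmonic mean
# of precision and recall, beta>1 weights recall more heavily. Sketch
# (illustration only, not mmseg's implementation):

```python
def f_beta(precision, recall, beta=1):
    # beta > 1 favours recall; beta < 1 favours precision
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

f1 = f_beta(0.5, 1.0)          # harmonic mean of 0.5 and 1.0 -> 2/3
f2 = f_beta(0.5, 1.0, beta=2)  # recall-weighted -> 5/6
```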
def test_metrics():
    pred_size = (10, 30, 30)
    num_classes = 19
    ignore_index = 255
    results = np.random.randint(0, num_classes, size=pred_size)
    label = np.random.randint(0, num_classes, size=pred_size)

    # Test the availability of arg: ignore_index.
    label[:, 2, 5:10] = ignore_index

    # Test the correctness of the implementation of mIoU calculation.
    ret_metrics = eval_metrics(
        results, label, num_classes, ignore_index, metrics='mIoU')
    all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'IoU']
    all_acc_l, acc_l, iou_l = legacy_mean_iou(results, label, num_classes,
                                              ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(acc, acc_l)
    assert np.allclose(iou, iou_l)

    # Test the correctness of the implementation of mDice calculation.
    ret_metrics = eval_metrics(
        results, label, num_classes, ignore_index, metrics='mDice')
    all_acc, acc, dice = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'Dice']
    all_acc_l, acc_l, dice_l = legacy_mean_dice(results, label, num_classes,
                                                ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(acc, acc_l)
    assert np.allclose(dice, dice_l)

    # Test the correctness of the implementation of mFscore calculation.
    ret_metrics = eval_metrics(
        results, label, num_classes, ignore_index, metrics='mFscore')
    all_acc, recall, precision, fscore = ret_metrics['aAcc'], ret_metrics[
        'Recall'], ret_metrics['Precision'], ret_metrics['Fscore']
    all_acc_l, recall_l, precision_l, fscore_l = legacy_mean_fscore(
        results, label, num_classes, ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(recall, recall_l)
    assert np.allclose(precision, precision_l)
    assert np.allclose(fscore, fscore_l)

    # Test the correctness of the implementation of joint calculation.
    ret_metrics = eval_metrics(
        results,
        label,
        num_classes,
        ignore_index,
        metrics=['mIoU', 'mDice', 'mFscore'])
    all_acc, acc, iou, dice, precision, recall, fscore = ret_metrics[
        'aAcc'], ret_metrics['Acc'], ret_metrics['IoU'], ret_metrics[
            'Dice'], ret_metrics['Precision'], ret_metrics[
                'Recall'], ret_metrics['Fscore']
    assert all_acc == all_acc_l
    assert np.allclose(acc, acc_l)
    assert np.allclose(iou, iou_l)
    assert np.allclose(dice, dice_l)
    assert np.allclose(precision, precision_l)
    assert np.allclose(recall, recall_l)
    assert np.allclose(fscore, fscore_l)

    # Test the correctness of calculation when arg: num_classes is larger
    # than the maximum value of input maps.
    results = np.random.randint(0, 5, size=pred_size)
    label = np.random.randint(0, 4, size=pred_size)
    ret_metrics = eval_metrics(
        results,
        label,
        num_classes,
        ignore_index=255,
        metrics='mIoU',
        nan_to_num=-1)
    all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'IoU']
    assert acc[-1] == -1
    assert iou[-1] == -1

    ret_metrics = eval_metrics(
        results,
        label,
        num_classes,
        ignore_index=255,
        metrics='mDice',
        nan_to_num=-1)
    all_acc, acc, dice = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'Dice']
    assert acc[-1] == -1
    assert dice[-1] == -1

    ret_metrics = eval_metrics(
        results,
        label,
        num_classes,
        ignore_index=255,
        metrics='mFscore',
        nan_to_num=-1)
    all_acc, precision, recall, fscore = ret_metrics['aAcc'], ret_metrics[
        'Precision'], ret_metrics['Recall'], ret_metrics['Fscore']
    assert precision[-1] == -1
    assert recall[-1] == -1
    assert fscore[-1] == -1

    ret_metrics = eval_metrics(
        results,
        label,
        num_classes,
        ignore_index=255,
        metrics=['mDice', 'mIoU', 'mFscore'],
        nan_to_num=-1)
    all_acc, acc, iou, dice, precision, recall, fscore = ret_metrics[
        'aAcc'], ret_metrics['Acc'], ret_metrics['IoU'], ret_metrics[
            'Dice'], ret_metrics['Precision'], ret_metrics[
                'Recall'], ret_metrics['Fscore']
    assert acc[-1] == -1
    assert dice[-1] == -1
    assert iou[-1] == -1
    assert precision[-1] == -1
    assert recall[-1] == -1
    assert fscore[-1] == -1

    # Test the bug which is caused by torch.histc.
    # torch.histc: https://pytorch.org/docs/stable/generated/torch.histc.html
    # When the arg:bins is set to be same as arg:max,
    # some channels of mIoU may be nan.
    results = np.array([np.repeat(31, 59)])
    label = np.array([np.arange(59)])
    num_classes = 59
    ret_metrics = eval_metrics(
        results, label, num_classes, ignore_index=255, metrics='mIoU')
    all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'IoU']
    assert not np.any(np.isnan(iou))
def test_mean_iou():
    pred_size = (10, 30, 30)
    num_classes = 19
    ignore_index = 255
    results = np.random.randint(0, num_classes, size=pred_size)
    label = np.random.randint(0, num_classes, size=pred_size)
    label[:, 2, 5:10] = ignore_index
    ret_metrics = mean_iou(results, label, num_classes, ignore_index)
    all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'IoU']
    all_acc_l, acc_l, iou_l = legacy_mean_iou(results, label, num_classes,
                                              ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(acc, acc_l)
    assert np.allclose(iou, iou_l)

    results = np.random.randint(0, 5, size=pred_size)
    label = np.random.randint(0, 4, size=pred_size)
    ret_metrics = mean_iou(
        results, label, num_classes, ignore_index=255, nan_to_num=-1)
    all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'IoU']
    assert acc[-1] == -1
    assert iou[-1] == -1
def test_mean_dice():
    pred_size = (10, 30, 30)
    num_classes = 19
    ignore_index = 255
    results = np.random.randint(0, num_classes, size=pred_size)
    label = np.random.randint(0, num_classes, size=pred_size)
    label[:, 2, 5:10] = ignore_index
    ret_metrics = mean_dice(results, label, num_classes, ignore_index)
    all_acc, acc, dice = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'Dice']
    all_acc_l, acc_l, dice_l = legacy_mean_dice(results, label, num_classes,
                                                ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(acc, acc_l)
    assert np.allclose(dice, dice_l)

    results = np.random.randint(0, 5, size=pred_size)
    label = np.random.randint(0, 4, size=pred_size)
    ret_metrics = mean_dice(
        results, label, num_classes, ignore_index=255, nan_to_num=-1)
    all_acc, acc, dice = ret_metrics['aAcc'], ret_metrics['Acc'], ret_metrics[
        'Dice']
    assert acc[-1] == -1
    assert dice[-1] == -1
def test_mean_fscore():
    pred_size = (10, 30, 30)
    num_classes = 19
    ignore_index = 255
    results = np.random.randint(0, num_classes, size=pred_size)
    label = np.random.randint(0, num_classes, size=pred_size)
    label[:, 2, 5:10] = ignore_index
    ret_metrics = mean_fscore(results, label, num_classes, ignore_index)
    all_acc, recall, precision, fscore = ret_metrics['aAcc'], ret_metrics[
        'Recall'], ret_metrics['Precision'], ret_metrics['Fscore']
    all_acc_l, recall_l, precision_l, fscore_l = legacy_mean_fscore(
        results, label, num_classes, ignore_index)
    assert all_acc == all_acc_l
    assert np.allclose(recall, recall_l)
    assert np.allclose(precision, precision_l)
    assert np.allclose(fscore, fscore_l)

    ret_metrics = mean_fscore(
        results, label, num_classes, ignore_index, beta=2)
    all_acc, recall, precision, fscore = ret_metrics['aAcc'], ret_metrics[
        'Recall'], ret_metrics['Precision'], ret_metrics['Fscore']
    all_acc_l, recall_l, precision_l, fscore_l = legacy_mean_fscore(
        results, label, num_classes, ignore_index, beta=2)
    assert all_acc == all_acc_l
    assert np.allclose(recall, recall_l)
    assert np.allclose(precision, precision_l)
    assert np.allclose(fscore, fscore_l)

    results = np.random.randint(0, 5, size=pred_size)
    label = np.random.randint(0, 4, size=pred_size)
    ret_metrics = mean_fscore(
        results, label, num_classes, ignore_index=255, nan_to_num=-1)
    all_acc, recall, precision, fscore = ret_metrics['aAcc'], ret_metrics[
        'Recall'], ret_metrics['Precision'], ret_metrics['Fscore']
    assert recall[-1] == -1
    assert precision[-1] == -1
    assert fscore[-1] == -1
def test_filename_inputs():
    import cv2
    import tempfile

    def save_arr(input_arrays: list, title: str, is_image: bool, dir: str):
        filenames = []
        SUFFIX = '.png' if is_image else '.npy'
        for idx, arr in enumerate(input_arrays):
            filename = '{}/{}-{}{}'.format(dir, title, idx, SUFFIX)
            if is_image:
                cv2.imwrite(filename, arr)
            else:
                np.save(filename, arr)
            filenames.append(filename)
        return filenames

    pred_size = (10, 30, 30)
    num_classes = 19
    ignore_index = 255
    results = np.random.randint(0, num_classes, size=pred_size)
    labels = np.random.randint(0, num_classes, size=pred_size)
    labels[:, 2, 5:10] = ignore_index

    with tempfile.TemporaryDirectory() as temp_dir:
        result_files = save_arr(results, 'pred', False, temp_dir)
        label_files = save_arr(labels, 'label', True, temp_dir)

        ret_metrics = eval_metrics(
            result_files,
            label_files,
            num_classes,
            ignore_index,
            metrics='mIoU')
        all_acc, acc, iou = ret_metrics['aAcc'], ret_metrics[
            'Acc'], ret_metrics['IoU']
        all_acc_l, acc_l, iou_l = legacy_mean_iou(results, labels, num_classes,
                                                  ignore_index)
        assert all_acc == all_acc_l
        assert np.allclose(acc, acc_l)
        assert np.allclose(iou, iou_l)
# --- harness/determined/_swagger/client/api/internal_api.py (determined, Apache-2.0) ---

# coding: utf-8
"""
Determined API (Beta)
Determined helps deep learning teams train models more quickly, easily share GPU resources, and effectively collaborate. Determined allows deep learning engineers to focus on building and training models at scale, without needing to worry about DevOps or writing custom code for common tasks like fault tolerance or experiment tracking. You can think of Determined as a platform that bridges the gap between tools like TensorFlow and PyTorch --- which work great for a single researcher with a single GPU --- to the challenges that arise when doing deep learning at scale, as teams, clusters, and data sets all increase in size. # noqa: E501
OpenAPI spec version: 0.1
Contact: community@determined.ai
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from determined._swagger.client.api_client import ApiClient
class InternalApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
    def determined_complete_trial_searcher_validation(self, trial_id, body, **kwargs):  # noqa: E501
        """Reports to the searcher that the trial has completed the given searcher operation.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_complete_trial_searcher_validation(trial_id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int trial_id: The id of the trial. (required)
        :param V1CompleteValidateAfterOperation body: The completed operation. (required)
        :return: V1CompleteTrialSearcherValidationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.determined_complete_trial_searcher_validation_with_http_info(trial_id, body, **kwargs)  # noqa: E501
        else:
            (data) = self.determined_complete_trial_searcher_validation_with_http_info(trial_id, body, **kwargs)  # noqa: E501
            return data

    def determined_complete_trial_searcher_validation_with_http_info(self, trial_id, body, **kwargs):  # noqa: E501
        """Reports to the searcher that the trial has completed the given searcher operation.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_complete_trial_searcher_validation_with_http_info(trial_id, body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int trial_id: The id of the trial. (required)
        :param V1CompleteValidateAfterOperation body: The completed operation. (required)
        :return: V1CompleteTrialSearcherValidationResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['trial_id', 'body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method determined_complete_trial_searcher_validation" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'trial_id' is set
        if ('trial_id' not in params or
                params['trial_id'] is None):
            raise ValueError("Missing the required parameter `trial_id` when calling `determined_complete_trial_searcher_validation`")  # noqa: E501
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `determined_complete_trial_searcher_validation`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'trial_id' in params:
            path_params['trialId'] = params['trial_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['BearerToken']  # noqa: E501

        return self.api_client.call_api(
            '/api/v1/trials/{trialId}/searcher/completed_operation', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='V1CompleteTrialSearcherValidationResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def determined_compute_hp_importance(self, experiment_id, **kwargs):  # noqa: E501
        """Trigger the computation of hyperparameter importance on-demand for a specific metric on a specific experiment. The status and results can be retrieved with GetHPImportance.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_compute_hp_importance(experiment_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int experiment_id: The id of the experiment. (required)
        :return: V1ComputeHPImportanceResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.determined_compute_hp_importance_with_http_info(experiment_id, **kwargs)  # noqa: E501
        else:
            (data) = self.determined_compute_hp_importance_with_http_info(experiment_id, **kwargs)  # noqa: E501
            return data

    def determined_compute_hp_importance_with_http_info(self, experiment_id, **kwargs):  # noqa: E501
        """Trigger the computation of hyperparameter importance on-demand for a specific metric on a specific experiment. The status and results can be retrieved with GetHPImportance.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_compute_hp_importance_with_http_info(experiment_id, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param int experiment_id: The id of the experiment. (required)
        :return: V1ComputeHPImportanceResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['experiment_id']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method determined_compute_hp_importance" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'experiment_id' is set
        if ('experiment_id' not in params or
                params['experiment_id'] is None):
            raise ValueError("Missing the required parameter `experiment_id` when calling `determined_compute_hp_importance`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'experiment_id' in params:
            path_params['experimentId'] = params['experiment_id']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['BearerToken']  # noqa: E501

        return self.api_client.call_api(
            '/api/v1/experiments/{experimentId}/hyperparameter-importance', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='V1ComputeHPImportanceResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
    def determined_create_experiment(self, body, **kwargs):  # noqa: E501
        """Create an experiment.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_create_experiment(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param V1CreateExperimentRequest body: (required)
        :return: V1CreateExperimentResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.determined_create_experiment_with_http_info(body, **kwargs)  # noqa: E501
        else:
            (data) = self.determined_create_experiment_with_http_info(body, **kwargs)  # noqa: E501
            return data

    def determined_create_experiment_with_http_info(self, body, **kwargs):  # noqa: E501
        """Create an experiment.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.determined_create_experiment_with_http_info(body, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param V1CreateExperimentRequest body: (required)
        :return: V1CreateExperimentResponse
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['body']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method determined_create_experiment" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'body' is set
        if ('body' not in params or
                params['body'] is None):
            raise ValueError("Missing the required parameter `body` when calling `determined_create_experiment`")  # noqa: E501

        collection_formats = {}

        path_params = {}

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'body' in params:
            body_params = params['body']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json'])  # noqa: E501

        # Authentication setting
        auth_settings = ['BearerToken']  # noqa: E501

        return self.api_client.call_api(
            '/api/v1/experiments', 'POST',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type='V1CreateExperimentResponse',  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)
def determined_get_current_trial_searcher_operation(self, trial_id, **kwargs): # noqa: E501
"""Get the current searcher operation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_current_trial_searcher_operation(trial_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:return: V1GetCurrentTrialSearcherOperationResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_get_current_trial_searcher_operation_with_http_info(trial_id, **kwargs) # noqa: E501
else:
(data) = self.determined_get_current_trial_searcher_operation_with_http_info(trial_id, **kwargs) # noqa: E501
return data
def determined_get_current_trial_searcher_operation_with_http_info(self, trial_id, **kwargs): # noqa: E501
"""Get the current searcher operation. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_current_trial_searcher_operation_with_http_info(trial_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:return: V1GetCurrentTrialSearcherOperationResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['trial_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_get_current_trial_searcher_operation" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'trial_id' is set
if ('trial_id' not in params or
params['trial_id'] is None):
raise ValueError("Missing the required parameter `trial_id` when calling `determined_get_current_trial_searcher_operation`") # noqa: E501
collection_formats = {}
path_params = {}
if 'trial_id' in params:
path_params['trialId'] = params['trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{trialId}/searcher/operation', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1GetCurrentTrialSearcherOperationResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_get_hp_importance(self, experiment_id, **kwargs): # noqa: E501
"""Retrieve the latest computation of hyperparameter importance. Currently this is triggered for training loss (if emitted) and the searcher metric after 10% increments in an experiment's progress, but no more than every 10 minutes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_hp_importance(experiment_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1GetHPImportanceResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_get_hp_importance_with_http_info(experiment_id, **kwargs) # noqa: E501
else:
(data) = self.determined_get_hp_importance_with_http_info(experiment_id, **kwargs) # noqa: E501
return data
def determined_get_hp_importance_with_http_info(self, experiment_id, **kwargs): # noqa: E501
"""Retrieve the latest computation of hyperparameter importance. Currently this is triggered for training loss (if emitted) and the searcher metric after 10% increments in an experiment's progress, but no more than every 10 minutes. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_hp_importance_with_http_info(experiment_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1GetHPImportanceResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['experiment_id', 'period_seconds'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_get_hp_importance" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'experiment_id' is set
if ('experiment_id' not in params or
params['experiment_id'] is None):
raise ValueError("Missing the required parameter `experiment_id` when calling `determined_get_hp_importance`") # noqa: E501
collection_formats = {}
path_params = {}
if 'experiment_id' in params:
path_params['experimentId'] = params['experiment_id'] # noqa: E501
query_params = []
if 'period_seconds' in params:
query_params.append(('periodSeconds', params['period_seconds'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/experiments/{experimentId}/hyperparameter-importance', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1GetHPImportanceResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_get_resource_pools(self, **kwargs): # noqa: E501
"""Get a list of all resource pools from the cluster. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_resource_pools(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Skip this many resource pools before returning results. A negative value skips that many resource pools from the end instead.
:param int limit: Limit the number of resource pools. A value of 0 denotes no limit.
:return: V1GetResourcePoolsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_get_resource_pools_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.determined_get_resource_pools_with_http_info(**kwargs) # noqa: E501
return data
def determined_get_resource_pools_with_http_info(self, **kwargs): # noqa: E501
"""Get a list of all resource pools from the cluster. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_resource_pools_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:param int offset: Skip this many resource pools before returning results. A negative value skips that many resource pools from the end instead.
:param int limit: Limit the number of resource pools. A value of 0 denotes no limit.
:return: V1GetResourcePoolsResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['offset', 'limit'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_get_resource_pools" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
if 'offset' in params:
query_params.append(('offset', params['offset'])) # noqa: E501
if 'limit' in params:
query_params.append(('limit', params['limit'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/resource-pools', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1GetResourcePoolsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
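# The query-string assembly above maps snake_case Python parameters to the
# camelCase wire names (e.g. period_seconds -> periodSeconds) and includes
# only the parameters actually supplied. A minimal sketch, with a
# hypothetical helper name:

```python
def build_query_params(params, name_map):
    # Collect (wireName, value) tuples only for parameters that were
    # supplied, mirroring the per-parameter `if 'x' in params:` checks.
    return [(wire, params[py]) for py, wire in name_map.items() if py in params]

qp = build_query_params(
    {'offset': 5, 'limit': 0},
    {'offset': 'offset', 'limit': 'limit', 'period_seconds': 'periodSeconds'})
```

# Absent optional parameters (here period_seconds) simply produce no tuple,
# so they are omitted from the request URL entirely.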
def determined_get_telemetry(self, **kwargs): # noqa: E501
"""Get telemetry information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_telemetry(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: V1GetTelemetryResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_get_telemetry_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.determined_get_telemetry_with_http_info(**kwargs) # noqa: E501
return data
def determined_get_telemetry_with_http_info(self, **kwargs): # noqa: E501
"""Get telemetry information. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_get_telemetry_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: V1GetTelemetryResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_get_telemetry" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = [] # noqa: E501
return self.api_client.call_api(
'/api/v1/master/telemetry', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1GetTelemetryResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_metric_batches(self, experiment_id, metric_name, metric_type, **kwargs): # noqa: E501
"""Get the milestones (in batches processed) at which a metric is recorded by an experiment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_metric_batches(experiment_id, metric_name, metric_type, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1MetricBatchesResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_metric_batches_with_http_info(experiment_id, metric_name, metric_type, **kwargs) # noqa: E501
else:
(data) = self.determined_metric_batches_with_http_info(experiment_id, metric_name, metric_type, **kwargs) # noqa: E501
return data
def determined_metric_batches_with_http_info(self, experiment_id, metric_name, metric_type, **kwargs): # noqa: E501
"""Get the milestones (in batches processed) at which a metric is recorded by an experiment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_metric_batches_with_http_info(experiment_id, metric_name, metric_type, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1MetricBatchesResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['experiment_id', 'metric_name', 'metric_type', 'period_seconds'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_metric_batches" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'experiment_id' is set
if ('experiment_id' not in params or
params['experiment_id'] is None):
raise ValueError("Missing the required parameter `experiment_id` when calling `determined_metric_batches`") # noqa: E501
# verify the required parameter 'metric_name' is set
if ('metric_name' not in params or
params['metric_name'] is None):
raise ValueError("Missing the required parameter `metric_name` when calling `determined_metric_batches`") # noqa: E501
# verify the required parameter 'metric_type' is set
if ('metric_type' not in params or
params['metric_type'] is None):
raise ValueError("Missing the required parameter `metric_type` when calling `determined_metric_batches`") # noqa: E501
collection_formats = {}
path_params = {}
if 'experiment_id' in params:
path_params['experimentId'] = params['experiment_id'] # noqa: E501
query_params = []
if 'metric_name' in params:
query_params.append(('metricName', params['metric_name'])) # noqa: E501
if 'metric_type' in params:
query_params.append(('metricType', params['metric_type'])) # noqa: E501
if 'period_seconds' in params:
query_params.append(('periodSeconds', params['period_seconds'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/experiments/{experimentId}/metrics-stream/batches', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1MetricBatchesResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
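# Every *_with_http_info method repeats the same two validation steps:
# reject unexpected keywords with TypeError, then check each required
# parameter is present and non-None, raising ValueError otherwise. A
# standalone sketch of that pattern (helper name is hypothetical):

```python
def validate_call(method, required, allowed, supplied):
    # Reject unexpected keyword arguments, as the generated loop does.
    for key in supplied:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method))
    # Then verify every required parameter is present and non-None.
    for name in required:
        if supplied.get(name) is None:
            raise ValueError(
                "Missing the required parameter `%s` when calling `%s`"
                % (name, method))

allowed = ['experiment_id', 'metric_name', 'metric_type', 'period_seconds',
           'async_req', '_return_http_data_only', '_preload_content',
           '_request_timeout']
# A valid call passes silently:
validate_call('determined_metric_batches',
              ['experiment_id', 'metric_name', 'metric_type'], allowed,
              {'experiment_id': 7, 'metric_name': 'loss',
               'metric_type': 'METRIC_TYPE_TRAINING'})
```

# Note that an explicit None for a required parameter fails the same way as
# omitting it, matching the `not in params or is None` checks above.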
def determined_metric_names(self, experiment_id, **kwargs): # noqa: E501
"""Get the set of metric names recorded for an experiment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_metric_names(experiment_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1MetricNamesResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_metric_names_with_http_info(experiment_id, **kwargs) # noqa: E501
else:
(data) = self.determined_metric_names_with_http_info(experiment_id, **kwargs) # noqa: E501
return data
def determined_metric_names_with_http_info(self, experiment_id, **kwargs): # noqa: E501
"""Get the set of metric names recorded for an experiment. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_metric_names_with_http_info(experiment_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1MetricNamesResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['experiment_id', 'period_seconds'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_metric_names" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'experiment_id' is set
if ('experiment_id' not in params or
params['experiment_id'] is None):
raise ValueError("Missing the required parameter `experiment_id` when calling `determined_metric_names`") # noqa: E501
collection_formats = {}
path_params = {}
if 'experiment_id' in params:
path_params['experimentId'] = params['experiment_id'] # noqa: E501
query_params = []
if 'period_seconds' in params:
query_params.append(('periodSeconds', params['period_seconds'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/experiments/{experimentId}/metrics-stream/metric-names', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1MetricNamesResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_report_trial_checkpoint_metadata(self, checkpoint_metadata_trial_id, body, **kwargs): # noqa: E501
"""Record a checkpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_checkpoint_metadata(checkpoint_metadata_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int checkpoint_metadata_trial_id: The ID of the trial associated with the checkpoint. (required)
:param V1CheckpointMetadata body: The checkpoint metadata to persist. (required)
:return: V1ReportTrialCheckpointMetadataResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_report_trial_checkpoint_metadata_with_http_info(checkpoint_metadata_trial_id, body, **kwargs) # noqa: E501
else:
(data) = self.determined_report_trial_checkpoint_metadata_with_http_info(checkpoint_metadata_trial_id, body, **kwargs) # noqa: E501
return data
def determined_report_trial_checkpoint_metadata_with_http_info(self, checkpoint_metadata_trial_id, body, **kwargs): # noqa: E501
"""Record a checkpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_checkpoint_metadata_with_http_info(checkpoint_metadata_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int checkpoint_metadata_trial_id: The ID of the trial associated with the checkpoint. (required)
:param V1CheckpointMetadata body: The checkpoint metadata to persist. (required)
:return: V1ReportTrialCheckpointMetadataResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['checkpoint_metadata_trial_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_report_trial_checkpoint_metadata" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'checkpoint_metadata_trial_id' is set
if ('checkpoint_metadata_trial_id' not in params or
params['checkpoint_metadata_trial_id'] is None):
raise ValueError("Missing the required parameter `checkpoint_metadata_trial_id` when calling `determined_report_trial_checkpoint_metadata`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `determined_report_trial_checkpoint_metadata`") # noqa: E501
collection_formats = {}
path_params = {}
if 'checkpoint_metadata_trial_id' in params:
path_params['checkpointMetadata.trialId'] = params['checkpoint_metadata_trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{checkpointMetadata.trialId}/checkpoint_metadata', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1ReportTrialCheckpointMetadataResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_report_trial_progress(self, trial_id, body, **kwargs): # noqa: E501
"""For bookkeeping, update the progress toward the currently requested searcher training length. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_progress(trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:param float body: Total units completed by the trial, in terms of the unit used to configure the searcher. (required)
:return: V1ReportTrialProgressResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_report_trial_progress_with_http_info(trial_id, body, **kwargs) # noqa: E501
else:
(data) = self.determined_report_trial_progress_with_http_info(trial_id, body, **kwargs) # noqa: E501
return data
def determined_report_trial_progress_with_http_info(self, trial_id, body, **kwargs): # noqa: E501
"""For bookkeeping, update the progress toward the currently requested searcher training length. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_progress_with_http_info(trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:param float body: Total units completed by the trial, in terms of the unit used to configure the searcher. (required)
:return: V1ReportTrialProgressResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['trial_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_report_trial_progress" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'trial_id' is set
if ('trial_id' not in params or
params['trial_id'] is None):
raise ValueError("Missing the required parameter `trial_id` when calling `determined_report_trial_progress`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `determined_report_trial_progress`") # noqa: E501
collection_formats = {}
path_params = {}
if 'trial_id' in params:
path_params['trialId'] = params['trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{trialId}/progress', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1ReportTrialProgressResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_report_trial_searcher_early_exit(self, trial_id, body, **kwargs): # noqa: E501
"""Report to the searcher that the trial has completed the currently requested amount of training, with the given searcher validation metric. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_searcher_early_exit(trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:param V1TrialEarlyExit body: The exit reason. (required)
:return: V1ReportTrialSearcherEarlyExitResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_report_trial_searcher_early_exit_with_http_info(trial_id, body, **kwargs) # noqa: E501
else:
(data) = self.determined_report_trial_searcher_early_exit_with_http_info(trial_id, body, **kwargs) # noqa: E501
return data
def determined_report_trial_searcher_early_exit_with_http_info(self, trial_id, body, **kwargs): # noqa: E501
"""Report to the searcher that the trial has completed the currently requested amount of training, with the given searcher validation metric. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_searcher_early_exit_with_http_info(trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The id of the trial. (required)
:param V1TrialEarlyExit body: The exit reason. (required)
:return: V1ReportTrialSearcherEarlyExitResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['trial_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_report_trial_searcher_early_exit" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'trial_id' is set
if ('trial_id' not in params or
params['trial_id'] is None):
raise ValueError("Missing the required parameter `trial_id` when calling `determined_report_trial_searcher_early_exit`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `determined_report_trial_searcher_early_exit`") # noqa: E501
collection_formats = {}
path_params = {}
if 'trial_id' in params:
path_params['trialId'] = params['trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{trialId}/early_exit', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1ReportTrialSearcherEarlyExitResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
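Every `*_with_http_info` method above opens with the same generated kwargs-validation pattern: an `all_params` whitelist plus a loop that rejects anything unexpected. A minimal standalone sketch of that pattern (the function name and example values here are illustrative, not part of the generated client):

```python
def validate_kwargs(all_params, kwargs):
    # Reject any keyword argument the generated method does not declare.
    # The four underscore-prefixed names are transport options accepted
    # by every method, mirroring the all_params.append(...) calls above.
    allowed = set(all_params) | {
        'async_req', '_return_http_data_only',
        '_preload_content', '_request_timeout',
    }
    for key in kwargs:
        if key not in allowed:
            raise TypeError(
                "Got an unexpected keyword argument '%s'" % key
            )
    return dict(kwargs)

# 'trial_id' is declared and 'async_req' is always allowed, so this passes:
params = validate_kwargs(['trial_id', 'body'],
                         {'trial_id': 7, 'async_req': True})
```

An undeclared keyword such as `bogus=1` would raise `TypeError`, matching the behavior of the generated methods.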
def determined_report_trial_training_metrics(self, training_metrics_trial_id, body, **kwargs): # noqa: E501
"""Record training metrics for specified training. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_training_metrics(training_metrics_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int training_metrics_trial_id: The trial associated with these metrics. (required)
:param V1TrainingMetrics body: The training metrics to persist. (required)
:return: V1ReportTrialTrainingMetricsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_report_trial_training_metrics_with_http_info(training_metrics_trial_id, body, **kwargs) # noqa: E501
else:
(data) = self.determined_report_trial_training_metrics_with_http_info(training_metrics_trial_id, body, **kwargs) # noqa: E501
return data
def determined_report_trial_training_metrics_with_http_info(self, training_metrics_trial_id, body, **kwargs): # noqa: E501
"""Record training metrics for specified training. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_training_metrics_with_http_info(training_metrics_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int training_metrics_trial_id: The trial associated with these metrics. (required)
:param V1TrainingMetrics body: The training metrics to persist. (required)
:return: V1ReportTrialTrainingMetricsResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['training_metrics_trial_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_report_trial_training_metrics" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'training_metrics_trial_id' is set
if ('training_metrics_trial_id' not in params or
params['training_metrics_trial_id'] is None):
raise ValueError("Missing the required parameter `training_metrics_trial_id` when calling `determined_report_trial_training_metrics`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `determined_report_trial_training_metrics`") # noqa: E501
collection_formats = {}
path_params = {}
if 'training_metrics_trial_id' in params:
path_params['trainingMetrics.trialId'] = params['training_metrics_trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{trainingMetrics.trialId}/training_metrics', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1ReportTrialTrainingMetricsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_report_trial_validation_metrics(self, validation_metrics_trial_id, body, **kwargs): # noqa: E501
"""Record validation metrics. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_validation_metrics(validation_metrics_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int validation_metrics_trial_id: The trial associated with these metrics. (required)
:param V1ValidationMetrics body: The validation metrics to persist. (required)
:return: V1ReportTrialValidationMetricsResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_report_trial_validation_metrics_with_http_info(validation_metrics_trial_id, body, **kwargs) # noqa: E501
else:
(data) = self.determined_report_trial_validation_metrics_with_http_info(validation_metrics_trial_id, body, **kwargs) # noqa: E501
return data
def determined_report_trial_validation_metrics_with_http_info(self, validation_metrics_trial_id, body, **kwargs): # noqa: E501
"""Record validation metrics. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_report_trial_validation_metrics_with_http_info(validation_metrics_trial_id, body, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int validation_metrics_trial_id: The trial associated with these metrics. (required)
:param V1ValidationMetrics body: The validation metrics to persist. (required)
:return: V1ReportTrialValidationMetricsResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['validation_metrics_trial_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_report_trial_validation_metrics" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'validation_metrics_trial_id' is set
if ('validation_metrics_trial_id' not in params or
params['validation_metrics_trial_id'] is None):
raise ValueError("Missing the required parameter `validation_metrics_trial_id` when calling `determined_report_trial_validation_metrics`") # noqa: E501
# verify the required parameter 'body' is set
if ('body' not in params or
params['body'] is None):
raise ValueError("Missing the required parameter `body` when calling `determined_report_trial_validation_metrics`") # noqa: E501
collection_formats = {}
path_params = {}
if 'validation_metrics_trial_id' in params:
path_params['validationMetrics.trialId'] = params['validation_metrics_trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{validationMetrics.trialId}/validation_metrics', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='V1ReportTrialValidationMetricsResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_trial_preemption_signal(self, trial_id, **kwargs): # noqa: E501
"""Stream preemption signals for the given trial. Upon connection a signal is sent immediately, for synchronization purposes. If it is to preempt, that will be the only signal and the trial should preempt. Otherwise, the trial should continue to listen. The only signal ever sent after this will be to preempt. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trial_preemption_signal(trial_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The requested trial's id. (required)
:return: StreamResultOfV1TrialPreemptionSignalResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_trial_preemption_signal_with_http_info(trial_id, **kwargs) # noqa: E501
else:
(data) = self.determined_trial_preemption_signal_with_http_info(trial_id, **kwargs) # noqa: E501
return data
def determined_trial_preemption_signal_with_http_info(self, trial_id, **kwargs): # noqa: E501
"""Stream preemption signals for the given trial. Upon connection a signal is sent immediately, for synchronization purposes. If it is to preempt, that will be the only signal and the trial should preempt. Otherwise, the trial should continue to listen. The only signal ever sent after this will be to preempt. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trial_preemption_signal_with_http_info(trial_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int trial_id: The requested trial's id. (required)
:return: StreamResultOfV1TrialPreemptionSignalResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['trial_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_trial_preemption_signal" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'trial_id' is set
if ('trial_id' not in params or
params['trial_id'] is None):
raise ValueError("Missing the required parameter `trial_id` when calling `determined_trial_preemption_signal`") # noqa: E501
collection_formats = {}
path_params = {}
if 'trial_id' in params:
path_params['trialId'] = params['trial_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/trials/{trialId}/signals/preemption', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1TrialPreemptionSignalResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def determined_trials_sample(self, experiment_id, metric_name, metric_type, **kwargs): # noqa: E501
"""Get a sample of the metrics over time for a sample of the trials. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trials_sample(experiment_id, metric_name, metric_type, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int max_trials: Maximum number of trials to fetch data for.
:param int max_datapoints: Maximum number of initial / historical data points.
:param int start_batches: Beginning of window (inclusive) to fetch data for.
:param int end_batches: Ending of window (inclusive) to fetch data for.
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1TrialsSampleResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_trials_sample_with_http_info(experiment_id, metric_name, metric_type, **kwargs) # noqa: E501
else:
(data) = self.determined_trials_sample_with_http_info(experiment_id, metric_name, metric_type, **kwargs) # noqa: E501
return data
def determined_trials_sample_with_http_info(self, experiment_id, metric_name, metric_type, **kwargs): # noqa: E501
"""Get a sample of the metrics over time for a sample of the trials. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trials_sample_with_http_info(experiment_id, metric_name, metric_type, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int max_trials: Maximum number of trials to fetch data for.
:param int max_datapoints: Maximum number of initial / historical data points.
:param int start_batches: Beginning of window (inclusive) to fetch data for.
:param int end_batches: Ending of window (inclusive) to fetch data for.
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1TrialsSampleResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['experiment_id', 'metric_name', 'metric_type', 'max_trials', 'max_datapoints', 'start_batches', 'end_batches', 'period_seconds'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_trials_sample" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'experiment_id' is set
if ('experiment_id' not in params or
params['experiment_id'] is None):
raise ValueError("Missing the required parameter `experiment_id` when calling `determined_trials_sample`") # noqa: E501
# verify the required parameter 'metric_name' is set
if ('metric_name' not in params or
params['metric_name'] is None):
raise ValueError("Missing the required parameter `metric_name` when calling `determined_trials_sample`") # noqa: E501
# verify the required parameter 'metric_type' is set
if ('metric_type' not in params or
params['metric_type'] is None):
raise ValueError("Missing the required parameter `metric_type` when calling `determined_trials_sample`") # noqa: E501
collection_formats = {}
path_params = {}
if 'experiment_id' in params:
path_params['experimentId'] = params['experiment_id'] # noqa: E501
query_params = []
if 'metric_name' in params:
query_params.append(('metricName', params['metric_name'])) # noqa: E501
if 'metric_type' in params:
query_params.append(('metricType', params['metric_type'])) # noqa: E501
if 'max_trials' in params:
query_params.append(('maxTrials', params['max_trials'])) # noqa: E501
if 'max_datapoints' in params:
query_params.append(('maxDatapoints', params['max_datapoints'])) # noqa: E501
if 'start_batches' in params:
query_params.append(('startBatches', params['start_batches'])) # noqa: E501
if 'end_batches' in params:
query_params.append(('endBatches', params['end_batches'])) # noqa: E501
if 'period_seconds' in params:
query_params.append(('periodSeconds', params['period_seconds'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/experiments/{experimentId}/metrics-stream/trials-sample', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1TrialsSampleResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
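The query-parameter section above repeats one idea per argument: each snake_case Python name that the caller actually supplied is appended as a camelCase `(name, value)` pair for the REST endpoint. A condensed sketch of that translation (function and mapping names are illustrative):

```python
def build_query_params(params, mapping):
    # Translate supplied snake_case arguments into the camelCase
    # (name, value) pairs the endpoint expects, skipping any argument
    # the caller left out -- the same effect as the chain of
    # "if '<name>' in params: query_params.append(...)" blocks above.
    return [(rest_name, params[py_name])
            for py_name, rest_name in mapping
            if py_name in params]

mapping = [
    ('metric_name', 'metricName'),
    ('metric_type', 'metricType'),
    ('max_trials', 'maxTrials'),
]
# metric_type is omitted by the caller, so it is skipped in the output.
query = build_query_params({'metric_name': 'loss', 'max_trials': 5}, mapping)
```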
def determined_trials_snapshot(self, experiment_id, metric_name, metric_type, batches_processed, **kwargs): # noqa: E501
"""Get a snapshot of a metric across all trials at a certain point of progress. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trials_snapshot(experiment_id, metric_name, metric_type, batches_processed, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int batches_processed: The point of progress at which to query metrics. (required)
:param int batches_margin: A range on either side of batches_processed within which to include near-misses.
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1TrialsSnapshotResponse
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.determined_trials_snapshot_with_http_info(experiment_id, metric_name, metric_type, batches_processed, **kwargs) # noqa: E501
else:
(data) = self.determined_trials_snapshot_with_http_info(experiment_id, metric_name, metric_type, batches_processed, **kwargs) # noqa: E501
return data
def determined_trials_snapshot_with_http_info(self, experiment_id, metric_name, metric_type, batches_processed, **kwargs): # noqa: E501
"""Get a snapshot of a metric across all trials at a certain point of progress. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.determined_trials_snapshot_with_http_info(experiment_id, metric_name, metric_type, batches_processed, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int experiment_id: The id of the experiment. (required)
:param str metric_name: A metric name. (required)
:param str metric_type: The type of metric. - METRIC_TYPE_UNSPECIFIED: Zero-value (not allowed). - METRIC_TYPE_TRAINING: For metrics emitted during training. - METRIC_TYPE_VALIDATION: For metrics emitted during validation. (required)
:param int batches_processed: The point of progress at which to query metrics. (required)
:param int batches_margin: A range on either side of batches_processed within which to include near-misses.
:param int period_seconds: Seconds to wait when polling for updates.
:return: StreamResultOfV1TrialsSnapshotResponse
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['experiment_id', 'metric_name', 'metric_type', 'batches_processed', 'batches_margin', 'period_seconds'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method determined_trials_snapshot" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'experiment_id' is set
if ('experiment_id' not in params or
params['experiment_id'] is None):
raise ValueError("Missing the required parameter `experiment_id` when calling `determined_trials_snapshot`") # noqa: E501
# verify the required parameter 'metric_name' is set
if ('metric_name' not in params or
params['metric_name'] is None):
raise ValueError("Missing the required parameter `metric_name` when calling `determined_trials_snapshot`") # noqa: E501
# verify the required parameter 'metric_type' is set
if ('metric_type' not in params or
params['metric_type'] is None):
raise ValueError("Missing the required parameter `metric_type` when calling `determined_trials_snapshot`") # noqa: E501
# verify the required parameter 'batches_processed' is set
if ('batches_processed' not in params or
params['batches_processed'] is None):
raise ValueError("Missing the required parameter `batches_processed` when calling `determined_trials_snapshot`") # noqa: E501
collection_formats = {}
path_params = {}
if 'experiment_id' in params:
path_params['experimentId'] = params['experiment_id'] # noqa: E501
query_params = []
if 'metric_name' in params:
query_params.append(('metricName', params['metric_name'])) # noqa: E501
if 'metric_type' in params:
query_params.append(('metricType', params['metric_type'])) # noqa: E501
if 'batches_processed' in params:
query_params.append(('batchesProcessed', params['batches_processed'])) # noqa: E501
if 'batches_margin' in params:
query_params.append(('batchesMargin', params['batches_margin'])) # noqa: E501
if 'period_seconds' in params:
query_params.append(('periodSeconds', params['period_seconds'])) # noqa: E501
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json']) # noqa: E501
# Authentication setting
auth_settings = ['BearerToken'] # noqa: E501
return self.api_client.call_api(
'/api/v1/experiments/{experimentId}/metrics-stream/trials-snapshot', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='StreamResultOfV1TrialsSnapshotResponse', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
import torch
import torch.nn as nn
from torchvision import datasets, transforms
import numpy as np
from os import path
from .namers import attack_file_namer
def tiny_imagenet(args):
data_dir = args.directory + "data/"
train_dir = path.join(data_dir, "original_dataset",
"tiny-imagenet-200", "train")
test_dir = path.join(data_dir, "original_dataset",
"tiny-imagenet-200", "val")
use_cuda = not args.no_cuda and torch.cuda.is_available()
kwargs = {"num_workers": 4, "pin_memory": True} if use_cuda else {}
transform_train = transforms.Compose(
[
transforms.RandomCrop(64, padding=4),
transforms.RandomHorizontalFlip(),
transforms.ToTensor(),
]
)
transform_test = transforms.Compose([transforms.ToTensor(), ])
trainset = datasets.ImageFolder(train_dir, transform=transform_train)
train_loader = torch.utils.data.DataLoader(
trainset, batch_size=args.train_batch_size, shuffle=True, num_workers=2
)
testset = datasets.ImageFolder(test_dir, transform=transform_test)
test_loader = torch.utils.data.DataLoader(
testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
)
return train_loader, test_loader
def tiny_imagenet_from_file(args):
use_cuda = not args.no_cuda and torch.cuda.is_available()
kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read the attacked test images from the saved .npy file
if args.attack_box_type == "other" and args.attack_otherbox_type == "transfer":
filepath = args.directory + "data/attacked_dataset/" + \
args.dataset + "/" + args.attack_transfer_file
elif args.attack_box_type == "white":
filepath = attack_file_namer(args)
else:
raise AssertionError
test_images = np.load(filepath)
data_dir = args.directory + "data/"
test_dir = path.join(data_dir, "original_dataset",
"tiny-imagenet-200", "val")
transform_test = transforms.Compose([transforms.ToTensor(), ])
testset = datasets.ImageFolder(test_dir, transform=transform_test)
test_loader = torch.utils.data.DataLoader(
testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
)
tensor_x = torch.Tensor(test_images / np.max(test_images))
tensor_y = torch.Tensor(test_loader.dataset.targets).long()
tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
attack_loader = torch.utils.data.DataLoader(
tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
)
return attack_loader
def tiny_imagenet_initialization_from_file(args):
use_cuda = not args.no_cuda and torch.cuda.is_available()
kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read the attack-initialization images from the saved .npy file
filepath = args.directory + "data/attacked_dataset/" + \
args.dataset + "/" + args.attack_initialization_file
test_images = np.load(filepath)
data_dir = args.directory + "data/"
test_dir = path.join(data_dir, "original_dataset",
"tiny-imagenet-200", "val")
transform_test = transforms.Compose([transforms.ToTensor(), ])
testset = datasets.ImageFolder(test_dir, transform=transform_test)
test_loader = torch.utils.data.DataLoader(
testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
)
tensor_x = torch.Tensor(test_images / np.max(test_images))
tensor_y = torch.Tensor(test_loader.dataset.targets).long()
tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
attack_loader = torch.utils.data.DataLoader(
tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
)
return attack_loader
def imagenette(args):
data_dir = args.directory + "data/"
train_dir = path.join(data_dir, "original_dataset",
"imagenette2-160", "train")
test_dir = path.join(data_dir, "original_dataset",
"imagenette2-160", "val")
use_cuda = not args.no_cuda and torch.cuda.is_available()
kwargs = {"num_workers": 4, "pin_memory": True} if use_cuda else {}
transform_train = transforms.Compose(
[
transforms.RandomCrop((160), padding=4),
transforms.RandomHorizontalFlip(),
transforms.ToTensor(),
]
)
transform_test = transforms.Compose(
[
transforms.CenterCrop(160),
transforms.ToTensor(),
]
)
trainset = datasets.ImageFolder(train_dir, transform=transform_train)
train_loader = torch.utils.data.DataLoader(
trainset, batch_size=args.train_batch_size, shuffle=True, num_workers=2
)
testset = datasets.ImageFolder(test_dir, transform=transform_test)
test_loader = torch.utils.data.DataLoader(
testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
)
return train_loader, test_loader
def imagenette_from_file(args):
use_cuda = not args.no_cuda and torch.cuda.is_available()
kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read the attacked test images from the saved .npy file
if args.attack_box_type == "other" and args.attack_otherbox_type == "transfer":
filepath = args.directory + "data/attacked_dataset/" + \
args.dataset + "/" + args.attack_transfer_file
elif args.attack_box_type == "white":
filepath = attack_file_namer(args)
else:
raise AssertionError
test_images = np.load(filepath)
data_dir = args.directory + "data/"
test_dir = path.join(data_dir, "original_dataset",
"imagenette2-160", "val")
transform_test = transforms.Compose([transforms.ToTensor(), ])
testset = datasets.ImageFolder(test_dir, transform=transform_test)
test_loader = torch.utils.data.DataLoader(
testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
)
tensor_x = torch.Tensor(test_images / np.max(test_images))
tensor_y = torch.Tensor(test_loader.dataset.targets).long()
tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
attack_loader = torch.utils.data.DataLoader(
tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
)
return attack_loader


def imagenette_initialization_from_file(args):
    use_cuda = not args.no_cuda and torch.cuda.is_available()
    kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read
    filepath = args.directory + "data/attacked_dataset/" + \
        args.dataset + "/" + args.attack_initialization_file
    test_images = np.load(filepath)
    data_dir = args.directory + "data/"
    test_dir = path.join(data_dir, "original_dataset",
                         "imagenette2-160", "val")
    transform_test = transforms.Compose([transforms.ToTensor()])
    testset = datasets.ImageFolder(test_dir, transform=transform_test)
    test_loader = torch.utils.data.DataLoader(
        testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
    )
    tensor_x = torch.Tensor(test_images / np.max(test_images))
    tensor_y = torch.Tensor(test_loader.dataset.targets).long()
    tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
    attack_loader = torch.utils.data.DataLoader(
        tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
    )
    return attack_loader


def cifar10(args):
    use_cuda = not args.no_cuda and torch.cuda.is_available()
    kwargs = {"num_workers": 4, "pin_memory": True} if use_cuda else {}
    transform_train = transforms.Compose(
        [
            transforms.RandomCrop(32, padding=4),
            transforms.RandomHorizontalFlip(),
            transforms.ToTensor(),
        ]
    )
    transform_test = transforms.Compose([transforms.ToTensor()])
    trainset = datasets.CIFAR10(
        root=args.directory + "data/original_dataset",
        train=True,
        download=True,
        transform=transform_train,
    )
    train_loader = torch.utils.data.DataLoader(
        trainset, batch_size=args.train_batch_size, shuffle=True, num_workers=2
    )
    testset = datasets.CIFAR10(
        root=args.directory + "data/original_dataset",
        train=False,
        download=True,
        transform=transform_test,
    )
    test_loader = torch.utils.data.DataLoader(
        testset, batch_size=args.test_batch_size, shuffle=False, num_workers=2
    )
    return train_loader, test_loader


def cifar10_from_file(args):
    use_cuda = not args.no_cuda and torch.cuda.is_available()
    kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read
    if args.attack_box_type == "other" and args.attack_otherbox_type == "transfer":
        filepath = args.directory + "data/attacked_dataset/" + \
            args.dataset + "/" + args.attack_transfer_file
    elif args.attack_box_type == "white":
        filepath = attack_file_namer(args)
    else:
        raise AssertionError
    test_images = np.load(filepath)
    cifar10 = datasets.CIFAR10(
        path.join(args.directory, "data/original_dataset"),
        train=False,
        transform=None,
        target_transform=None,
        download=False,
    )
    tensor_x = torch.Tensor(test_images / np.max(test_images))
    tensor_y = torch.Tensor(cifar10.targets).long()
    tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
    attack_loader = torch.utils.data.DataLoader(
        tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
    )
    return attack_loader


def cifar10_initialization_from_file(args):
    use_cuda = not args.no_cuda and torch.cuda.is_available()
    kwargs = {"num_workers": 1, "pin_memory": True} if use_cuda else {}
    # Read
    filepath = args.directory + "data/attacked_dataset/" + \
        args.dataset + "/" + args.attack_initialization_file
    test_images = np.load(filepath)
    cifar10 = datasets.CIFAR10(
        path.join(args.directory, "data/original_dataset"),
        train=False,
        transform=None,
        target_transform=None,
        download=False,
    )
    tensor_x = torch.Tensor(test_images / np.max(test_images))
    tensor_y = torch.Tensor(cifar10.targets).long()
    tensor_data = torch.utils.data.TensorDataset(tensor_x, tensor_y)
    attack_loader = torch.utils.data.DataLoader(
        tensor_data, batch_size=args.test_batch_size, shuffle=False, **kwargs
    )
    return attack_loader
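Every `*_from_file` loader above rescales the saved attack images into [0, 1] by dividing by the global maximum (`test_images / np.max(test_images)`) before wrapping them in a `TensorDataset`. A minimal pure-Python sketch of just that normalization step; the helper name is ours, not part of the original code:

```python
def normalize_to_unit(images):
    """Scale nested lists of pixel values into [0, 1] by the global maximum."""
    peak = max(max(row) for row in images)
    # Mirrors `test_images / np.max(test_images)` in the loaders above.
    return [[value / peak for value in row] for row in images]


print(normalize_to_unit([[0, 128], [255, 64]]))
```

The division by the global (not per-image) maximum means relative brightness between images is preserved, which is what the saved `.npy` attack batches rely on.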

# File: config/yolo_config.py (felixchenfy/ros_yolo_as_template_matching, MIT)

# -*- coding: future_fstrings -*-
from __future__ import division
''' Provide a class to write yolo configs to file '''
class YoloConfig(object):

    def __init__(self, num_classes, num_layers):
        if num_layers not in [1, 2, 3]:
            raise ValueError("Yolo layer number should be 1, 2, or 3")
        self.str_yolo_config = set_yolo_config(num_classes, num_layers)

    def get(self):
        return self.str_yolo_config

    def write_to_file(self, filename):
        with open(filename, 'w') as f:
            f.write(self.str_yolo_config)
        print("Write yolo config to: ", filename)


def set_yolo_config(num_classes, num_layers):
    num_filters = (num_classes + 5) * 3

    yolo_basic = f'''[net]
# Testing
#batch=1
#subdivisions=1
# Training
batch=16
subdivisions=1
width=416
height=416
channels=3
momentum=0.9
decay=0.0005
angle=0
saturation = 1.5
exposure = 1.5
hue=.1
learning_rate=0.001
burn_in=1000
max_batches = 500200
policy=steps
steps=400000,450000
scales=.1,.1
[convolutional]
batch_normalize=1
filters=32
size=3
stride=1
pad=1
activation=leaky
# Downsample
[convolutional]
batch_normalize=1
filters=64
size=3
stride=2
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=32
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=64
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
# Downsample
[convolutional]
batch_normalize=1
filters=128
size=3
stride=2
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=64
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=64
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
# Downsample
[convolutional]
batch_normalize=1
filters=256
size=3
stride=2
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
# Downsample
[convolutional]
batch_normalize=1
filters=512
size=3
stride=2
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
# Downsample
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=2
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[shortcut]
from=-3
activation=linear
######################
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[convolutional]
size=1
stride=1
pad=1
filters={num_filters}
activation=linear
'''
    yolo_layer1 = f'''[yolo]
mask = 6,7,8
anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326
classes={num_classes}
num=9
jitter=.3
ignore_thresh = .7
truth_thresh = 1
random=1
[route]
layers = -4
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[upsample]
stride=2
[route]
layers = -1, 61
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=512
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=512
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=512
activation=leaky
[convolutional]
size=1
stride=1
pad=1
filters={num_filters}
activation=linear
'''
    yolo_layer2 = f'''[yolo]
mask = 3,4,5
anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326
classes={num_classes}
num=9
jitter=.3
ignore_thresh = .7
truth_thresh = 1
random=1
[route]
layers = -4
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[upsample]
stride=2
[route]
layers = -1, 36
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=256
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=256
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=256
activation=leaky
[convolutional]
size=1
stride=1
pad=1
filters={num_filters}
activation=linear
'''
    yolo_layer3 = f'''[yolo]
mask = 0,1,2
anchors = 10,13, 16,30, 33,23, 30,61, 62,45, 59,119, 116,90, 156,198, 373,326
classes={num_classes}
num=9
jitter=.3
ignore_thresh = .7
truth_thresh = 1
random=1
'''
    if num_layers == 1:
        return yolo_basic + yolo_layer1
    elif num_layers == 2:
        return yolo_basic + yolo_layer1 + yolo_layer2
    elif num_layers == 3:
        return yolo_basic + yolo_layer1 + yolo_layer2 + yolo_layer3
if __name__ == "__main__":
def test_yolo_config():
print("Testing: yolo_config.py")
filename = "tmp.cfg"
yolo_config = YoloConfig(num_classes=2)
yolo_config.write_to_file(filename)
test_yolo_config()
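The `(num_classes + 5) * 3` filter count that `set_yolo_config` substitutes into each detection head comes from YOLOv3 predicting 3 anchor boxes per grid cell, with each box carrying 4 coordinates, 1 objectness score, and one score per class. A small standalone sketch of that arithmetic (the helper name is ours):

```python
def yolo_head_filters(num_classes, anchors_per_cell=3):
    # Each anchor predicts 4 box coords + 1 objectness + num_classes scores.
    return (4 + 1 + num_classes) * anchors_per_cell


print(yolo_head_filters(2))   # 2-class detector: 21 filters per head
print(yolo_head_filters(80))  # COCO's 80 classes: the familiar 255
```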

# File: src/problem008.py (wreckoner/PyEuler, MIT)

# -*- coding: utf-8 -*-
"""
Problem 8: Largest product in a series
The four adjacent digits in the 1000-digit number that have the greatest product are 9 × 9 × 8 × 9 = 5832.
73167176531330624919225119674426574742355349194934
96983520312774506326239578318016984801869478851843
85861560789112949495459501737958331952853208805511
12540698747158523863050715693290963295227443043557
66896648950445244523161731856403098711121722383113
62229893423380308135336276614282806444486645238749
30358907296290491560440772390713810515859307960866
70172427121883998797908792274921901699720888093776
65727333001053367881220235421809751254540594752243
52584907711670556013604839586446706324415722155397
53697817977846174064955149290862569321978468622482
83972241375657056057490261407972968652414535100474
82166370484403199890008895243450658541227588666881
16427171479924442928230863465674813919123162824586
17866458359124566529476545682848912883142607690042
24219022671055626321111109370544217506941658960408
07198403850962455444362981230987879927244284909188
84580156166097919133875499200524063689912560717606
05886116467109405077541002256983155200055935729725
71636269561882670428252483600823257530420752963450
Find the thirteen adjacent digits in the 1000-digit number that have the greatest product. What is the value of this product?
Answer: 23514624000
"""
SERIES = "73167176531330624919225119674426574742355349194934\
96983520312774506326239578318016984801869478851843\
85861560789112949495459501737958331952853208805511\
12540698747158523863050715693290963295227443043557\
66896648950445244523161731856403098711121722383113\
62229893423380308135336276614282806444486645238749\
30358907296290491560440772390713810515859307960866\
70172427121883998797908792274921901699720888093776\
65727333001053367881220235421809751254540594752243\
52584907711670556013604839586446706324415722155397\
53697817977846174064955149290862569321978468622482\
83972241375657056057490261407972968652414535100474\
82166370484403199890008895243450658541227588666881\
16427171479924442928230863465674813919123162824586\
17866458359124566529476545682848912883142607690042\
24219022671055626321111109370544217506941658960408\
07198403850962455444362981230987879927244284909188\
84580156166097919133875499200524063689912560717606\
05886116467109405077541002256983155200055935729725\
71636269561882670428252483600823257530420752963450"


def largest_product_in_a_series(number_string):
    """
    Slide a 13-character window across the whole number, compute the product
    of the digits in each window, and keep the largest product seen.
    """
    left, right = 0, 13
    largest_product = 0
    while right <= len(number_string):
        window = number_string[left:right]
        product = 1
        for digit in window:
            product *= int(digit)
        if product > largest_product:
            largest_product = product
        left += 1
        right += 1
    return largest_product
if __name__ == '__main__':
    print(largest_product_in_a_series(SERIES))
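For reference, the same window scan can be written more compactly with `math.prod` (Python 3.8+); this is a standalone sketch, not part of the original script:

```python
import math


def max_window_product(digits, k=13):
    # Take the product of every k-digit window and keep the largest.
    return max(
        math.prod(int(c) for c in digits[i:i + k])
        for i in range(len(digits) - k + 1)
    )


print(max_window_product("3675356291", 5))  # -> 3150
```

Any window containing a '0' contributes a product of 0, so it can never win unless every window contains one.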

# File: heat/tests/neutron/test_neutron_security_group.py (maestro-hybrid-cloud/heat, Apache-2.0)

#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from neutronclient.common import exceptions as neutron_exc
from neutronclient.v2_0 import client as neutronclient
from novaclient.v2 import security_group_rules as nova_sgr
from novaclient.v2 import security_groups as nova_sg
from heat.common import exception
from heat.common import template_format
from heat.engine import scheduler
from heat.engine import stack as parser
from heat.engine import template
from heat.tests import common
from heat.tests.nova import fakes as fakes_nova
from heat.tests import utils


class SecurityGroupTest(common.HeatTestCase):

    test_template = '''
heat_template_version: 2015-04-30
resources:
  the_sg:
    type: OS::Neutron::SecurityGroup
    properties:
      description: HTTP and SSH access
      rules:
      - port_range_min: 22
        port_range_max: 22
        remote_ip_prefix: 0.0.0.0/0
        protocol: tcp
      - port_range_min: 80
        port_range_max: 80
        protocol: tcp
        remote_ip_prefix: 0.0.0.0/0
      - remote_mode: remote_group_id
        remote_group_id: wwww
        protocol: tcp
      - direction: egress
        port_range_min: 22
        port_range_max: 22
        protocol: tcp
        remote_ip_prefix: 10.0.1.0/24
      - direction: egress
        remote_mode: remote_group_id
        remote_group_id: xxxx
      - direction: egress
        remote_mode: remote_group_id
'''

    test_template_update = '''
heat_template_version: 2015-04-30
resources:
  the_sg:
    type: OS::Neutron::SecurityGroup
    properties:
      description: SSH access for private network
      name: myrules
      rules:
      - port_range_min: 22
        port_range_max: 22
        remote_ip_prefix: 10.0.0.10/24
        protocol: tcp
'''

    test_template_validate = '''
heat_template_version: 2015-04-30
resources:
  the_sg:
    type: OS::Neutron::SecurityGroup
    properties:
      name: default
'''

    def setUp(self):
        super(SecurityGroupTest, self).setUp()
        self.fc = fakes_nova.FakeClient()
        self.m.StubOutWithMock(nova_sgr.SecurityGroupRuleManager, 'create')
        self.m.StubOutWithMock(nova_sgr.SecurityGroupRuleManager, 'delete')
        self.m.StubOutWithMock(nova_sg.SecurityGroupManager, 'create')
        self.m.StubOutWithMock(nova_sg.SecurityGroupManager, 'delete')
        self.m.StubOutWithMock(nova_sg.SecurityGroupManager, 'get')
        self.m.StubOutWithMock(nova_sg.SecurityGroupManager, 'list')
        self.m.StubOutWithMock(neutronclient.Client, 'create_security_group')
        self.m.StubOutWithMock(
            neutronclient.Client, 'create_security_group_rule')
        self.m.StubOutWithMock(neutronclient.Client, 'show_security_group')
        self.m.StubOutWithMock(
            neutronclient.Client, 'delete_security_group_rule')
        self.m.StubOutWithMock(neutronclient.Client, 'delete_security_group')
        self.m.StubOutWithMock(neutronclient.Client, 'update_security_group')

    def create_stack(self, templ):
        t = template_format.parse(templ)
        self.stack = self.parse_stack(t)
        self.assertIsNone(self.stack.create())
        return self.stack

    def parse_stack(self, t):
        stack_name = 'test_stack'
        tmpl = template.Template(t)
        stack = parser.Stack(utils.dummy_context(), stack_name, tmpl)
        stack.store()
        return stack

    def assertResourceState(self, rsrc, ref_id, metadata=None):
        metadata = metadata or {}
        self.assertIsNone(rsrc.validate())
        self.assertEqual((rsrc.CREATE, rsrc.COMPLETE), rsrc.state)
        self.assertEqual(ref_id, rsrc.FnGetRefId())
        self.assertEqual(metadata, dict(rsrc.metadata_get()))

    def test_security_group(self):
        show_created = {'security_group': {
            'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
            'name': 'sc1',
            'description': '',
            'security_group_rules': [{
                'direction': 'ingress',
                'protocol': 'tcp',
                'port_range_max': '22',
                'id': 'bbbb',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': '22'
            }, {
                'direction': 'ingress',
                'protocol': 'tcp',
                'port_range_max': '80',
                'id': 'cccc',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': '80'
            }, {
                'direction': 'ingress',
                'protocol': 'tcp',
                'port_range_max': None,
                'id': 'dddd',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': 'wwww',
                'remote_ip_prefix': None,
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': None
            }, {
                'direction': 'egress',
                'protocol': 'tcp',
                'port_range_max': '22',
                'id': 'eeee',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': None,
                'remote_ip_prefix': '10.0.1.0/24',
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': '22'
            }, {
                'direction': 'egress',
                'protocol': None,
                'port_range_max': None,
                'id': 'ffff',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': 'xxxx',
                'remote_ip_prefix': None,
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': None
            }, {
                'direction': 'egress',
                'protocol': None,
                'port_range_max': None,
                'id': 'gggg',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
                'remote_group_id': 'aaaa',
                'remote_ip_prefix': None,
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'port_range_min': None
            }],
            'id': 'aaaa'}
        }
        # create script
        sg_name = utils.PhysName('test_stack', 'the_sg')
        neutronclient.Client.create_security_group({
            'security_group': {
                'name': sg_name,
                'description': 'HTTP and SSH access'
            }
        }).AndReturn({
            'security_group': {
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'name': sg_name,
                'description': 'HTTP and SSH access',
                'security_group_rules': [{
                    "direction": "egress",
                    "ethertype": "IPv4",
                    "id": "aaaa-1",
                    "port_range_max": None,
                    "port_range_min": None,
                    "protocol": None,
                    "remote_group_id": None,
                    "remote_ip_prefix": None,
                    "security_group_id": "aaaa",
                    "tenant_id": "f18ca530cc05425e8bac0a5ff92f7e88"
                }, {
                    "direction": "egress",
                    "ethertype": "IPv6",
                    "id": "aaaa-2",
                    "port_range_max": None,
                    "port_range_min": None,
                    "protocol": None,
                    "remote_group_id": None,
                    "remote_ip_prefix": None,
                    "security_group_id": "aaaa",
                    "tenant_id": "f18ca530cc05425e8bac0a5ff92f7e88"
                }],
                'id': 'aaaa'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa',
                'id': 'bbbb'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'port_range_min': '80',
                'ethertype': 'IPv4',
                'port_range_max': '80',
                'protocol': 'tcp',
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '0.0.0.0/0',
                'port_range_min': '80',
                'ethertype': 'IPv4',
                'port_range_max': '80',
                'protocol': 'tcp',
                'security_group_id': 'aaaa',
                'id': 'cccc'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': 'wwww',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': 'tcp',
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': 'wwww',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': 'tcp',
                'security_group_id': 'aaaa',
                'id': 'dddd'
            }
        })
        neutronclient.Client.show_security_group('aaaa').AndReturn({
            'security_group': {
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'name': sg_name,
                'description': 'HTTP and SSH access',
                'security_group_rules': [{
                    "direction": "egress",
                    "ethertype": "IPv4",
                    "id": "aaaa-1",
                    "port_range_max": None,
                    "port_range_min": None,
                    "protocol": None,
                    "remote_group_id": None,
                    "remote_ip_prefix": None,
                    "security_group_id": "aaaa",
                    "tenant_id": "f18ca530cc05425e8bac0a5ff92f7e88"
                }, {
                    "direction": "egress",
                    "ethertype": "IPv6",
                    "id": "aaaa-2",
                    "port_range_max": None,
                    "port_range_min": None,
                    "protocol": None,
                    "remote_group_id": None,
                    "remote_ip_prefix": None,
                    "security_group_id": "aaaa",
                    "tenant_id": "f18ca530cc05425e8bac0a5ff92f7e88"
                }],
                'id': 'aaaa'
            }
        })
        neutronclient.Client.delete_security_group_rule('aaaa-1').AndReturn(
            None)
        neutronclient.Client.delete_security_group_rule('aaaa-2').AndReturn(
            None)
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': None,
                'remote_ip_prefix': '10.0.1.0/24',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': None,
                'remote_ip_prefix': '10.0.1.0/24',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa',
                'id': 'eeee'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': 'xxxx',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': 'xxxx',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa',
                'id': 'ffff'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': 'aaaa',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': 'aaaa',
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa',
                'id': 'gggg'
            }
        })
        # update script
        neutronclient.Client.update_security_group(
            'aaaa',
            {'security_group': {
                'description': 'SSH access for private network',
                'name': 'myrules'}}
        ).AndReturn({
            'security_group': {
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'name': 'myrules',
                'description': 'SSH access for private network',
                'security_group_rules': [],
                'id': 'aaaa'
            }
        })
        neutronclient.Client.show_security_group('aaaa').AndReturn(
            show_created)
        neutronclient.Client.delete_security_group_rule('bbbb').AndReturn(None)
        neutronclient.Client.delete_security_group_rule('cccc').AndReturn(None)
        neutronclient.Client.delete_security_group_rule('dddd').AndReturn(None)
        neutronclient.Client.delete_security_group_rule('eeee').AndReturn(None)
        neutronclient.Client.delete_security_group_rule('ffff').AndReturn(None)
        neutronclient.Client.delete_security_group_rule('gggg').AndReturn(None)
        neutronclient.Client.show_security_group('aaaa').AndReturn({
            'security_group': {
                'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
                'name': 'sc1',
                'description': '',
                'security_group_rules': [],
                'id': 'aaaa'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'egress',
                'ethertype': 'IPv4',
                'security_group_id': 'aaaa',
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': None,
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv4',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa',
                'id': 'hhhh'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'egress',
                'ethertype': 'IPv6',
                'security_group_id': 'aaaa',
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'egress',
                'remote_group_id': None,
                'remote_ip_prefix': None,
                'port_range_min': None,
                'ethertype': 'IPv6',
                'port_range_max': None,
                'protocol': None,
                'security_group_id': 'aaaa',
                'id': 'iiii'
            }
        })
        neutronclient.Client.create_security_group_rule({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '10.0.0.10/24',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa'
            }
        }).AndReturn({
            'security_group_rule': {
                'direction': 'ingress',
                'remote_group_id': None,
                'remote_ip_prefix': '10.0.0.10/24',
                'port_range_min': '22',
                'ethertype': 'IPv4',
                'port_range_max': '22',
                'protocol': 'tcp',
                'security_group_id': 'aaaa',
                'id': 'jjjj'
            }
        })
# delete script
neutronclient.Client.show_security_group('aaaa').AndReturn(
show_created)
neutronclient.Client.delete_security_group_rule('bbbb').AndReturn(None)
neutronclient.Client.delete_security_group_rule('cccc').AndReturn(None)
neutronclient.Client.delete_security_group_rule('dddd').AndReturn(None)
neutronclient.Client.delete_security_group_rule('eeee').AndReturn(None)
neutronclient.Client.delete_security_group_rule('ffff').AndReturn(None)
neutronclient.Client.delete_security_group_rule('gggg').AndReturn(None)
neutronclient.Client.delete_security_group('aaaa').AndReturn(None)
self.m.ReplayAll()
stack = self.create_stack(self.test_template)
sg = stack['the_sg']
self.assertResourceState(sg, 'aaaa')
updated_tmpl = template_format.parse(self.test_template_update)
updated_stack = utils.parse_stack(updated_tmpl)
stack.update(updated_stack)
stack.delete()
self.m.VerifyAll()
def test_security_group_exception(self):
# create script
sg_name = utils.PhysName('test_stack', 'the_sg')
neutronclient.Client.create_security_group({
'security_group': {
'name': sg_name,
'description': 'HTTP and SSH access'
}
}).AndReturn({
'security_group': {
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'name': sg_name,
'description': 'HTTP and SSH access',
'security_group_rules': [],
'id': 'aaaa'
}
})
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'ingress',
'remote_group_id': None,
'remote_ip_prefix': '0.0.0.0/0',
'port_range_min': '22',
'ethertype': 'IPv4',
'port_range_max': '22',
'protocol': 'tcp',
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'ingress',
'remote_group_id': None,
'remote_ip_prefix': '0.0.0.0/0',
'port_range_min': '80',
'ethertype': 'IPv4',
'port_range_max': '80',
'protocol': 'tcp',
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'ingress',
'remote_group_id': 'wwww',
'remote_ip_prefix': None,
'port_range_min': None,
'ethertype': 'IPv4',
'port_range_max': None,
'protocol': 'tcp',
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
neutronclient.Client.show_security_group('aaaa').AndReturn({
'security_group': {
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'name': sg_name,
'description': 'HTTP and SSH access',
'security_group_rules': [],
'id': 'aaaa'
}
})
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'egress',
'remote_group_id': None,
'remote_ip_prefix': '10.0.1.0/24',
'port_range_min': '22',
'ethertype': 'IPv4',
'port_range_max': '22',
'protocol': 'tcp',
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'egress',
'remote_group_id': 'xxxx',
'remote_ip_prefix': None,
'port_range_min': None,
'ethertype': 'IPv4',
'port_range_max': None,
'protocol': None,
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
neutronclient.Client.create_security_group_rule({
'security_group_rule': {
'direction': 'egress',
'remote_group_id': 'aaaa',
'remote_ip_prefix': None,
'port_range_min': None,
'ethertype': 'IPv4',
'port_range_max': None,
'protocol': None,
'security_group_id': 'aaaa'
}
}).AndRaise(
neutron_exc.Conflict())
# delete script
neutronclient.Client.show_security_group('aaaa').AndReturn({
'security_group': {
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'name': 'sc1',
'description': '',
'security_group_rules': [{
'direction': 'ingress',
'protocol': 'tcp',
'port_range_max': '22',
'id': 'bbbb',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': None,
'remote_ip_prefix': '0.0.0.0/0',
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': '22'
}, {
'direction': 'ingress',
'protocol': 'tcp',
'port_range_max': '80',
'id': 'cccc',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': None,
'remote_ip_prefix': '0.0.0.0/0',
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': '80'
}, {
'direction': 'ingress',
'protocol': 'tcp',
'port_range_max': None,
'id': 'dddd',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': 'wwww',
'remote_ip_prefix': None,
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': None
}, {
'direction': 'egress',
'protocol': 'tcp',
'port_range_max': '22',
'id': 'eeee',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': None,
'remote_ip_prefix': '10.0.1.0/24',
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': '22'
}, {
'direction': 'egress',
'protocol': None,
'port_range_max': None,
'id': 'ffff',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': None,
'remote_ip_prefix': 'xxxx',
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': None
}, {
'direction': 'egress',
'protocol': None,
'port_range_max': None,
'id': 'gggg',
'ethertype': 'IPv4',
'security_group_id': 'aaaa',
'remote_group_id': None,
'remote_ip_prefix': 'aaaa',
'tenant_id': 'f18ca530cc05425e8bac0a5ff92f7e88',
'port_range_min': None
}],
'id': 'aaaa'}})
neutronclient.Client.delete_security_group_rule('bbbb').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group_rule('cccc').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group_rule('dddd').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group_rule('eeee').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group_rule('ffff').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group_rule('gggg').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.delete_security_group('aaaa').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
neutronclient.Client.show_security_group('aaaa').AndRaise(
neutron_exc.NeutronClientException(status_code=404))
self.m.ReplayAll()
stack = self.create_stack(self.test_template)
sg = stack['the_sg']
self.assertResourceState(sg, 'aaaa')
scheduler.TaskRunner(sg.delete)()
sg.state_set(sg.CREATE, sg.COMPLETE, 'to delete again')
sg.resource_id = 'aaaa'
stack.delete()
self.m.VerifyAll()
def test_security_group_validate(self):
stack = self.create_stack(self.test_template_validate)
sg = stack['the_sg']
ex = self.assertRaises(exception.StackValidationFailed, sg.validate)
self.assertEqual(
'Security groups cannot be assigned the name "default".',
ex.message)
# mtorch/core/logger/__init__.py (repo: NullConvergence/torch_temp, MIT license)
from .logger import *
from .tensorboard_writer import *
from .tb_logger import *
from .sacred_logger import *
from .wandb_logger import *
from .neptune_logger import *
# Scripts/SetIfEmpty/SetIfEmpty_test.py (repo: lh34s8/content, MIT license)
from SetIfEmpty import get_value_to_set
def test_when_value_is_a_valid_string_should_return_value():
validString = "validString"
expectedOutput = validString
result = get_value_to_set({'value': validString, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_a_number_should_return_value():
number = 0
expectedOutput = number
result = get_value_to_set({'value': number, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_a_dictionary_should_return_value():
dictionary = {'name': "John", 'lastName': 'Doe'}
expectedOutput = dictionary
result = get_value_to_set({'value': dictionary, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_empty_string_should_return_default_value():
expectedOutput = "defaultValue"
result = get_value_to_set({'value': '', 'defaultValue': 'defaultValue', 'applyIfEmpty': 'True'})
assert expectedOutput == result
def test_when_value_is_empty_dictionary_should_return_default_value():
expectedOutput = "defaultValue"
result = get_value_to_set({'value': {}, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_none_should_return_default_value():
expectedOutput = "defaultValue"
result = get_value_to_set({'value': None, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_empty_string_and_apply_if_empty_is_false_should_return_empty_string():
expectedOutput = ""
result = get_value_to_set({'value': '', 'defaultValue': 'defaultValue', 'applyIfEmpty': 'False'})
assert expectedOutput == result
def test_when_value_is_empty_dictionary_and_apply_if_empty_is_false_should_return_empty_dictionary():
expectedOutput = {}
result = get_value_to_set({'value': {}, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'false'})
assert expectedOutput == result
def test_when_value_is_none_and_apply_if_empty_is_false_should_return_default_value():
expectedOutput = "defaultValue"
result = get_value_to_set({'value': None, 'defaultValue': 'defaultValue', 'applyIfEmpty': 'false'})
assert expectedOutput == result
def test_when_value_is_array_with_empty_string_and_apply_if_empty_is_true_should_return_default_value():
expectedOutput = "defaultValue"
result = get_value_to_set({'value': [""], 'defaultValue': 'defaultValue', 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_empty_dict():
value = {}
expectedOutput = "defaultValue"
result = get_value_to_set({'value': value, 'defaultValue': expectedOutput, 'applyIfEmpty': 'true'})
assert expectedOutput == result
def test_when_value_is_dict():
value = {1: 2}
result = get_value_to_set({'value': value, 'defaultValue': "defaultValue", 'applyIfEmpty': 'true'})
assert result == value
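For reference, the behaviour these tests pin down can be summarized in a minimal reimplementation sketch. This is an assumption reconstructed solely from the assertions above, not the actual `SetIfEmpty` source:

```python
def get_value_to_set(args):
    """Return args['value'], or args['defaultValue'] when the value is
    empty ('', {}, or ['']) and applyIfEmpty is 'true' (case-insensitive).

    A None value falls back to the default regardless of the flag,
    matching test_when_value_is_none_and_apply_if_empty_is_false.
    """
    value = args.get('value')
    apply_if_empty = str(args.get('applyIfEmpty')).lower() == 'true'
    is_empty = value == '' or value == {} or value == ['']
    if value is None or (apply_if_empty and is_empty):
        return args.get('defaultValue')
    return value
```

Note that falsy-but-meaningful values such as the number `0` are returned unchanged, since only the specific empty shapes above trigger the fallback.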
# autoscale_cloudroast/test_repo/autoscale/functional/scaling_group/
# test_scaling_group_negative.py (repo: alex/otter, Apache-2.0 license)
"""
Test negative scenarios for a scaling group.
"""
from test_repo.autoscale.fixtures import AutoscaleFixture
from autoscale.status_codes import HttpStatusCodes
class ScalingGroupNegative(AutoscaleFixture):
"""
Verify negative scenarios for scaling group.
"""
# @unittest.skip('invalid when tests are running in parallel and on a tenant that has groups')
# def test_list_scaling_group_when_none_exist(self):
# """
    # Negative test: List scaling groups when none exist on the account
    # (this also helps validate the teardowns within the test suite).
# """
# list_groups_resp = self.autoscale_client.list_scaling_groups()
# list_groups = list_groups_resp.entity
    # self.assertEquals(list_groups_resp.status_code, 200,
    #                   msg='The list group call when no groups exist failed with {0}'
    #                   .format(list_groups_resp.status_code))
# self.validate_headers(list_groups_resp.headers)
# self.assertEquals(list_groups, [],
# msg='Some scaling groups exist on the account')
# def test_scaling_group_name_blank(self):
# """
# Negative Test: Scaling group should not get created with an empty
# group configuration name
# """
# expected_status_code = HttpStatusCodes.BAD_REQUEST
# error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
# gc_name='')
# create_error = error_create_resp.entity
# self.assertEquals(error_create_resp.status_code, expected_status_code,
# msg='Create scaling group succeeded with invalid request: {0}'
# .format(error_create_resp.status_code))
# self.assertTrue(create_error is None,
# msg='Create scaling group with invalid request returned: {0}'
# .format(create_error))
def test_scaling_group_name_whitespace(self):
"""
Negative Test: Scaling group should not get created with group
configuration name as only whitespace
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_name=' ')
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create scaling group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create scaling group with invalid request returned: {0}'
.format(create_error))
def test_scaling_group_minentities_lessthan_zero(self):
"""
Negative Test: Scaling group should not get created when min entities
are less than Zero
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_min_entities='-100')
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create scaling group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create scaling group with invalid request returned: {0}'
.format(create_error))
def test_scaling_group_maxentities_lessthan_zero(self):
"""
Negative Test: Scaling group should not get created when max entities
are less than Zero
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_max_entities='-0.01')
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create scaling group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create scaling group with invalid request returned: {0}'
.format(create_error))
def test_scaling_group_maxentities_over_max(self):
"""
Negative Test: Scaling group should not get created when max entities
are over 25
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_max_entities=self.max_maxentities + 1)
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create scaling group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create scaling group with invalid request returned: {0}'
.format(create_error))
def test_scaling_group_cooldown_lessthan_zero(self):
"""
Negative Test: Scaling group should not get created when cooldown
is less than Zero
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
error_create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_cooldown='-0.08')
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create scaling group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create scaling group with invalid request returned: {0}'
.format(create_error))
def test_scaling_group_minentities_max(self):
"""
Negative Test: Scaling group should not get created when min entities are over allowed
maxentities
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
gc_min_entities = self.max_maxentities + 1
create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_min_entities=gc_min_entities)
self.assertEquals(create_resp.status_code, expected_status_code,
msg='Create scaling group passed with max minentities. Response: {0}'
.format(create_resp.status_code))
def test_create_scaling_group_minentities_over_maxentities(self):
"""
Negative Test: Scaling group should not get created when min entities are over maxentities
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
gc_min_entities = 22
gc_max_entities = 2
create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_min_entities=gc_min_entities,
gc_max_entities=gc_max_entities)
self.assertEquals(create_resp.status_code, expected_status_code,
msg='Create scaling group passed with max < minentities. Response: {0}'
.format(create_resp.status_code))
def test_scaling_group_maxentities_max(self):
"""
Negative Test: Scaling group should not get created when max entities
is over 25
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
gc_max_entities = self.max_maxentities + 1
create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_max_entities=gc_max_entities)
self.assertEquals(create_resp.status_code, expected_status_code,
msg='Create group passed when maxntities is over 25 with response: {0}'
.format(create_resp.status_code))
def test_scaling_group_with_max_cooldown(self):
"""
Negative Test: Scaling group should not get created when cooldown
is over 86400 seconds (24 hrs)
"""
expected_status_code = HttpStatusCodes.BAD_REQUEST
create_resp = self.autoscale_behaviors.create_scaling_group_given(
gc_cooldown=self.max_cooldown + 1)
self.assertEquals(create_resp.status_code, expected_status_code,
msg='Create group passed when cooldown is over 24 hrs with response: {0}'
.format(create_resp.status_code))
def test_get_invalid_group_id(self):
"""
Negative Test: Get group with invalid group id should fail with
resource not found 404
"""
group = 13344
expected_status_code = HttpStatusCodes.NOT_FOUND
error_create_resp = self.autoscale_client.view_manifest_config_for_scaling_group(
group_id=group)
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create group with invalid request returned: {0}'
.format(create_error))
def test_update_invalid_group_id(self):
"""
Negative Test: Update group with invalid group id should fail with
resource not found 404
"""
group = gc_max_entities = 25
expected_status_code = HttpStatusCodes.NOT_FOUND
error_create_resp = self.autoscale_client.update_group_config(
group_id=group,
name=self.gc_name,
cooldown=self.gc_cooldown,
min_entities=self.gc_min_entities,
max_entities=gc_max_entities,
metadata={})
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create group with invalid request returned: {0}'
.format(create_error))
def test_get_group_after_deletion(self):
"""
Negative Test: Get group when group is deleted should fail with 404
"""
create_resp = self.autoscale_behaviors.create_scaling_group_min()
group = create_resp.entity
del_resp = self.autoscale_client.delete_scaling_group(
group_id=group.id)
self.assertEquals(
create_resp.status_code, 201, msg='create group failed')
self.assertEquals(del_resp.status_code, 204, msg='Delete group failed')
expected_status_code = HttpStatusCodes.NOT_FOUND
error_create_resp = self.autoscale_client.view_manifest_config_for_scaling_group(
group_id=group.id)
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create group succeeded with invalid request: {0}'
.format(error_create_resp.status_code))
self.assertTrue(create_error is None,
msg='Create group with invalid request returned: {0}'
.format(create_error))
def test_update_group_after_deletion(self):
"""
Negative Test: Trying to update group when group is deleted should fail with 404
"""
create_resp = self.autoscale_behaviors.create_scaling_group_min()
group = create_resp.entity
del_resp = self.autoscale_client.delete_scaling_group(
group_id=group.id)
self.assertEquals(
create_resp.status_code, 201, msg='create group failed')
self.assertEquals(del_resp.status_code, 204, msg='Delete group failed')
expected_status_code = HttpStatusCodes.NOT_FOUND
error_create_resp = self.autoscale_client.update_group_config(
group_id=group.id,
name=self.gc_name,
cooldown=90,
min_entities=self.gc_min_entities,
max_entities=group.groupConfiguration.maxEntities,
metadata={})
create_error = error_create_resp.entity
self.assertEquals(error_create_resp.status_code, expected_status_code,
msg='Create group succeeded with invalid request: {0}, groupid: {1}'
.format(error_create_resp.status_code, group.id))
self.assertTrue(create_error is None,
msg='Create group with invalid request returned: {0}'
.format(create_error))
# snake_project/tests/test_gamemodes.py (repo: GonnaFlyMethod/snake1976, MIT license)
from extra.game_environment.menu_files.menu import Menu
from extra.game_environment.score_files.score import Score
from gamemodes.classic_mode import ClassicModeGameManager
from gamemodes.survival_mode import SurvivalModeGameManager
from gamemodes.battle_mode import BattleModeGameManager
class TestClassicGamemodeClass:
def setup(self):
"""Initialization of the game mode, player and installation of default
settings.
"""
menu_inst = Menu()
score_inst = Score('TestName')
self.gamemode = ClassicModeGameManager(score_inst, menu_inst)
self.gamemode.settings_storage['width'] = 40
self.gamemode.settings_storage['height'] = 20
walls = "can crawl through the walls"
self.gamemode.settings_storage['walls'] = walls
self.gamemode.settings_storage['speed'] = 0.08
self.gamemode.settings_storage['length'] = 3
self.gamemode.set_default_settings()
self.gamemode.initialize_new_player()
def test_initialize_new_player_method_classic_mode(self):
"""Testing the content of snake_segments' lists and the value
of adding points.
"""
assert len(self.gamemode.snake_segments_coord_x) != 0
assert len(self.gamemode.snake_segments_coord_y) != 0
assert self.gamemode.adding_points in [20, 30, 40]
def test_snake_and_walls_logic_classic_mode(self):
# Testing the ability to pass through one wall and exit the other.
# Note: see self.gamemode.settings_storage['walls'] in setup method of
# this class.
self.gamemode.head_x_coord = 41
self.gamemode.process_hook_logic()
assert self.gamemode.head_x_coord == 1
assert not self.gamemode.game_over
self.gamemode.head_x_coord = 0
self.gamemode.process_hook_logic()
assert self.gamemode.head_x_coord == self.gamemode.width - 1
assert not self.gamemode.game_over
self.gamemode.head_y_coord = 22
self.gamemode.process_hook_logic()
assert self.gamemode.head_y_coord == 0
assert not self.gamemode.game_over
self.gamemode.head_y_coord = -1
self.gamemode.process_hook_logic()
assert self.gamemode.head_y_coord == self.gamemode.height
assert not self.gamemode.game_over
# Testing logic when the ability to pass through the wall is disabled.
walls = "can't crawl through the walls"
self.gamemode.settings_storage['walls'] = walls
self.gamemode.set_default_settings()
self.gamemode.initialize_new_player()
self.gamemode.head_x_coord = 40
self.gamemode.process_hook_logic()
assert self.gamemode.game_over
self.gamemode.game_over = False
self.gamemode.head_x_coord = 0
self.gamemode.process_hook_logic()
assert self.gamemode.game_over
self.gamemode.game_over = False
self.gamemode.head_y_coord = 21
self.gamemode.process_hook_logic()
assert self.gamemode.game_over
self.gamemode.game_over = False
self.gamemode.head_y_coord = -1
self.gamemode.process_hook_logic()
assert self.gamemode.game_over
self.gamemode.game_over = False
def test_snake_eats_fruit_logic_classic_mode(self):
        # Testing the logic of increasing the number of the snake's segments
        # when it eats a fruit.
self.gamemode.head_x_coord = 20
self.gamemode.head_y_coord = 11
self.gamemode.x_coord_of_fruit = 20
self.gamemode.y_coord_of_fruit = 10
self.gamemode.process_hook_logic()
assert self.gamemode.num_of_snake_segments == 4
def test_snake_eats_itself_logic_classic_mode(self):
self.gamemode.head_x_coord = 20
self.gamemode.head_y_coord = 13
self.gamemode.process_hook_logic()
assert self.gamemode.game_over
class TestSurvivalGamemodeClass:
def setup(self):
"""Initialization of the game mode, players and installation of default
settings.
"""
menu_inst = Menu()
score_inst = Score('TestName')
self.gamemode = SurvivalModeGameManager(score_inst, menu_inst)
self.gamemode.settings_storage['width'] = 40
self.gamemode.settings_storage['height'] = 20
walls = "can crawl through the walls"
self.gamemode.settings_storage['walls'] = walls
self.gamemode.settings_storage['speed'] = 0.08
self.gamemode.settings_storage['length'] = 3
self.gamemode.set_default_settings()
self.gamemode.initialize_new_players()
def test_initialize_new_players_method_survival_mode(self):
"""Testing the content of snake_segments_coord_x and
snake_segments_coord_y for snake 1 and snake 2 and the value of adding
points.
"""
assert len(self.gamemode.snake_segments_coord_x_1) != 0
assert len(self.gamemode.snake_segments_coord_y_1) != 0
assert len(self.gamemode.snake_segments_coord_x_2) != 0
assert len(self.gamemode.snake_segments_coord_y_2) != 0
assert self.gamemode.adding_points in [20, 30, 40]
def test_snakes_and_walls_logic_survival_mode(self):
# Testing the ability to pass through one wall and exit the other for
# snake 1.
# Note: see self.gamemode.settings_storage['walls'] in setup method of
# this class.
self.gamemode.head_x_coord_1 = 41
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.head_x_coord_1 == 1
assert not self.gamemode.game_over_1
self.gamemode.head_x_coord_1 = 0
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.head_x_coord_1 == self.gamemode.width - 1
assert not self.gamemode.game_over_1
self.gamemode.head_y_coord_1 = 22
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.head_y_coord_1 == 0
assert not self.gamemode.game_over_1
self.gamemode.head_y_coord_1 = -1
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.head_y_coord_1 == self.gamemode.height
assert not self.gamemode.game_over_1
# Testing the ability to pass through one wall and exit the other for
# snake 2.
self.gamemode.head_x_coord_2 = 41
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.head_x_coord_2 == 1
assert not self.gamemode.game_over_2
self.gamemode.head_x_coord_2 = 0
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.head_x_coord_2 == self.gamemode.width - 1
assert not self.gamemode.game_over_2
self.gamemode.head_y_coord_2 = 22
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.head_y_coord_2 == 0
assert not self.gamemode.game_over_2
self.gamemode.head_y_coord_2 = -1
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.head_y_coord_2 == self.gamemode.height
assert not self.gamemode.game_over_2
# Testing logic when the ability to pass through the wall is disabled.
walls = "can't crawl through the walls"
self.gamemode.settings_storage['walls'] = walls
self.gamemode.set_default_settings()
self.gamemode.initialize_new_players()
# Snake 1.
self.gamemode.head_x_coord_1 = 40
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.game_over_1
self.gamemode.game_over_1 = False
self.gamemode.head_x_coord_1 = 0
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.game_over_1
self.gamemode.game_over_1 = False
self.gamemode.head_y_coord_1 = 21
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.game_over_1
self.gamemode.game_over_1 = False
self.gamemode.head_y_coord_1 = -1
self.gamemode.process_hook_logic_for_player_1()
assert self.gamemode.game_over_1
self.gamemode.game_over_1 = False
# Snake 2.
self.gamemode.head_x_coord_2 = 40
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.game_over_2
self.gamemode.game_over_2 = False
self.gamemode.head_x_coord_2 = 0
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.game_over_2
self.gamemode.game_over_2 = False
self.gamemode.head_y_coord_2 = 21
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.game_over_2
self.gamemode.game_over_2 = False
self.gamemode.head_y_coord_2 = -1
self.gamemode.process_hook_logic_for_player_2()
assert self.gamemode.game_over_2
self.gamemode.game_over_2 = False
    def test_snakes_eat_fruit_logic_survival_mode(self):
        # Testing the logic of increasing the number of snakes' segments when
        # they eat fruit.
        # Test for snake 1.
        self.gamemode.head_x_coord_1 = 20
        self.gamemode.head_y_coord_1 = 11
        self.gamemode.x_coord_of_fruit = 20
        self.gamemode.y_coord_of_fruit = 10
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.num_of_snake_segments_1 == 4
        # Test for snake 2.
        self.gamemode.head_x_coord_2 = 20
        self.gamemode.head_y_coord_2 = 11
        self.gamemode.x_coord_of_fruit = 20
        self.gamemode.y_coord_of_fruit = 10
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.num_of_snake_segments_2 == 4
    def test_snakes_eat_themselves_logic_survival_mode(self):
        # Snake 1.
        self.gamemode.head_x_coord_1 = 15
        self.gamemode.head_y_coord_1 = 13
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        self.gamemode.game_over_1 = False
        # Snake 2.
        self.gamemode.head_x_coord_2 = 25
        self.gamemode.head_y_coord_2 = 13
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
        self.gamemode.game_over_2 = False
    def test_common_logic_of_2_snakes_survival_mode(self):
        # If the two snakes' heads have the same coordinates, both lose.
        self.gamemode.head_x_coord_1 = 20
        self.gamemode.head_y_coord_1 = 20
        self.gamemode.head_x_coord_2 = 20
        self.gamemode.head_y_coord_2 = 20
        self.gamemode.process_common_logic_of_2_snakes()
        assert self.gamemode.game_over_1
        assert self.gamemode.game_over_2
        self.gamemode.game_over_1 = False
        self.gamemode.game_over_2 = False
        # If the coordinates of the first snake match the coordinates of the
        # elements of the tail of the 2nd snake, then the first snake loses and
        # vice versa.
        # Initial coords for the 2nd snake's segments.
        self.gamemode.head_x_coord_1 = 25
        self.gamemode.head_y_coord_1 = 13
        self.gamemode.process_common_logic_of_2_snakes()
        assert self.gamemode.game_over_1
        # Initial coords for the 1st snake's segments.
        self.gamemode.head_x_coord_2 = 15
        self.gamemode.head_y_coord_2 = 13
        self.gamemode.process_common_logic_of_2_snakes()
        assert self.gamemode.game_over_2
class TestBattleGamemodeClass:
    def setup(self):
        """Initialization of the game mode, players and installation of default
        settings.
        """
        menu_inst = Menu()
        score_inst = Score('TestName')
        self.gamemode = BattleModeGameManager(score_inst, menu_inst)
        walls = "can crawl through the walls"
        self.gamemode.settings_storage['walls'] = walls
        self.gamemode.settings_storage['speed'] = 0.08
        self.gamemode.settings_storage['game_time'] = 1000
        self.gamemode.settings_storage['length'] = 3
        self.gamemode.set_default_settings()
        self.gamemode.initialize_new_players()
    def test_initialize_new_players_method_battle_mode(self):
        """Testing the content of snake_segments_coord_x and
        snake_segments_coord_y for snake 1 and snake 2 and the value of adding
        points.
        """
        self.setup()
        assert len(self.gamemode.snake_segments_coord_x_1) != 0
        assert len(self.gamemode.snake_segments_coord_y_1) != 0
        assert len(self.gamemode.snake_segments_coord_x_2) != 0
        assert len(self.gamemode.snake_segments_coord_y_2) != 0
        assert self.gamemode.adding_points in [20, 30, 40]
    def test_snakes_and_walls_logic_battle_mode(self):
        # Testing the ability to pass through one wall and exit the other for
        # snake 1.
        # Note: see self.gamemode.settings_storage['walls'] in setup method of
        # this class.
        self.gamemode.head_x_coord_1 = 20
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.head_x_coord_1 == 1
        assert not self.gamemode.game_over_1
        self.gamemode.head_x_coord_1 = 0
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.head_x_coord_1 == 19
        assert not self.gamemode.game_over_1
        self.gamemode.head_y_coord_1 = 22
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.head_y_coord_1 == 0
        assert not self.gamemode.game_over_1
        self.gamemode.head_y_coord_1 = -1
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.head_y_coord_1 == 20
        assert not self.gamemode.game_over_1
        # Testing the ability to pass through one wall and exit the other for
        # snake 2.
        self.gamemode.head_x_coord_2 = 60
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.head_x_coord_2 == 41
        assert not self.gamemode.game_over_2
        self.gamemode.head_x_coord_2 = 40
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.head_x_coord_2 == 59
        assert not self.gamemode.game_over_2
        self.gamemode.head_y_coord_2 = 22
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.head_y_coord_2 == 0
        assert not self.gamemode.game_over_2
        self.gamemode.head_y_coord_2 = -1
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.head_y_coord_2 == 20
        assert not self.gamemode.game_over_2
        # Testing logic when the ability to pass through the wall is disabled.
        walls = "can't crawl through the walls"
        self.gamemode.settings_storage['walls'] = walls
        self.gamemode.set_default_settings()
        self.gamemode.initialize_new_players()
        # Snake 1.
        self.gamemode.head_x_coord_1 = 20
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        self.gamemode.game_over_1 = False
        self.gamemode.head_x_coord_1 = 0
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        self.gamemode.game_over_1 = False
        self.gamemode.head_y_coord_1 = 21
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        self.gamemode.game_over_1 = False
        self.gamemode.head_y_coord_1 = -1
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        self.gamemode.game_over_1 = False
        # Snake 2.
        self.gamemode.head_x_coord_2 = 60
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
        self.gamemode.game_over_2 = False
        self.gamemode.head_x_coord_2 = 40
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
        self.gamemode.game_over_2 = False
        self.gamemode.head_y_coord_2 = 21
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
        self.gamemode.game_over_2 = False
        self.gamemode.head_y_coord_2 = -1
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
        self.gamemode.game_over_2 = False
    def test_snakes_eat_fruit_logic_battle_mode(self):
        # Testing the logic of increasing the number of segments of the 2nd
        # snake when the first one eats fruit and vice versa.
        # Snake 1.
        self.gamemode.head_x_coord_1 = 10
        self.gamemode.head_y_coord_1 = 11
        self.gamemode.x_coord_of_fruit_1 = 10
        self.gamemode.y_coord_of_fruit_1 = 10
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.num_of_snake_segments_2 == 4
        # Snake 2.
        self.gamemode.head_x_coord_2 = 55
        self.gamemode.head_y_coord_2 = 11
        self.gamemode.x_coord_of_fruit_2 = 55
        self.gamemode.y_coord_of_fruit_2 = 10
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.num_of_snake_segments_1 == 4
    def test_snakes_eat_themselves_logic_battle_mode(self):
        # Snake 1.
        self.gamemode.head_x_coord_1 = 10
        self.gamemode.head_y_coord_1 = 13
        self.gamemode.process_hook_logic_for_player_1()
        assert self.gamemode.game_over_1
        # Snake 2.
        self.gamemode.head_x_coord_2 = 50
        self.gamemode.head_y_coord_2 = 13
        self.gamemode.process_hook_logic_for_player_2()
        assert self.gamemode.game_over_2
# --- file boundary: api/tests/__init__.py (repo Skelmis/nextcord-stats, MIT license) ---
import datetime


def get_aware_time() -> datetime.datetime:
    return datetime.datetime.now(datetime.timezone.utc)
# --- file boundary: New_Atoll_Code_region_aggregating_visualization_short_aco.py
# (repo ale37911/AtollGeoMorph, MIT license) ---
#%%---------------------Import python libraries-----------------------
#import gdal
import matplotlib.pyplot as plt
import numpy as np
import os
import pandas as pd
import seaborn as sns
import matplotlib.patches
#%%------------------input data
atollCompLoc = r'G:\Shared drives\Ortiz Atolls Database\CompositesWithCount\AllAtolls' # Location of all Atoll Composites (currently the ones made in 2019)
atollComp = os.listdir(atollCompLoc)
morphOutput = r'G:\Shared drives\Ortiz Atolls Database\MorphometricOutput' # Location that the output will be saved to
countryName = 'AllAtollsNew'
newAtollList = []
for i in range(len(atollComp)):
    fileName = atollComp[i]
    atollName = fileName[0:-20]
    full_path = morphOutput + '\\' + countryName + '\\' + atollName
    if os.path.exists(full_path):
        os.chdir(full_path)
        if os.path.isfile('df_motu.csv'):
            newAtollList.append(atollName)
PF = []
for i in range(len(newAtollList)):
    if newAtollList[i][0:4] == 'P_PF':
        PF.append(newAtollList[i])
#%% Create large dataFrames
i = 0
atollName = newAtollList[i]
fileName = atollName + '50c50mCountClip2.tif'
resolution = 30
morphOutput = r'G:\Shared drives\Ortiz Atolls Database\MorphometricOutput' # Location that the output will be saved to
countryName = 'AllAtollsNew'
full_path = morphOutput + '\\' + countryName + '\\' + atollName # create country and atoll directory if they do not exist
os.chdir(full_path) # set working directory to the atoll directory
# read in dataframes
df3 = pd.read_csv('df_reef_flat.csv')
df2 = pd.read_csv('df_motu.csv')
dfatoll = pd.read_csv('df_atollOnly.csv')
df2small = pd.read_csv('df_motu_small.csv')
df3['ocean basin'] = atollName[0]
df3['country code'] = atollName[2:4]
df3['atoll name'] = atollName[5:]
df2['ocean basin'] = atollName[0]
df2['country code'] = atollName[2:4]
df2['atoll name'] = atollName[5:]
dfatoll['ocean basin'] = atollName[0]
dfatoll['country code'] = atollName[2:4]
dfatoll['atoll name'] = atollName[5:]
df2small['ocean basin'] = atollName[0]
df2small['country code'] = atollName[2:4]
df2small['atoll name'] = atollName[5:]
unwanted = df2.columns[df2.columns.str.startswith('Unnamed')]
df2.drop(unwanted, axis=1, inplace=True)
unwanted = df3.columns[df3.columns.str.startswith('Unnamed')]
df3.drop(unwanted, axis=1, inplace=True)
unwanted = dfatoll.columns[dfatoll.columns.str.startswith('Unnamed')]
dfatoll.drop(unwanted, axis=1, inplace=True)
unwanted = df2small.columns[df2small.columns.str.startswith('Unnamed')]
df2small.drop(unwanted, axis=1, inplace=True)
df2all = df2.copy(deep=True)
df3all = df3.copy(deep=True)
dfatollall = dfatoll.copy(deep=True)
df2smallall = df2small.copy(deep=True)
for i in range(1,155):
    atollName = newAtollList[i]
    fileName = atollName + '50c50mCountClip2.tif'
    resolution = 30
    morphOutput = r'G:\Shared drives\Ortiz Atolls Database\MorphometricOutput' # Location that the output will be saved to
    countryName = 'AllAtollsNew'
    full_path = morphOutput + '\\' + countryName + '\\' + atollName # create country and atoll directory if they do not exist
    os.chdir(full_path) # set working directory to the atoll directory
    # read in dataframes
    df3 = pd.read_csv('df_reef_flat.csv')
    df2 = pd.read_csv('df_motu.csv')
    dfatoll = pd.read_csv('df_atollOnly.csv')
    df2small = pd.read_csv('df_motu_small.csv')
    df3['ocean basin'] = atollName[0]
    df3['country code'] = atollName[2:4]
    df3['atoll name'] = atollName[5:]
    df2['ocean basin'] = atollName[0]
    df2['country code'] = atollName[2:4]
    df2['atoll name'] = atollName[5:]
    dfatoll['ocean basin'] = atollName[0]
    dfatoll['country code'] = atollName[2:4]
    dfatoll['atoll name'] = atollName[5:]
    df2small['ocean basin'] = atollName[0]
    df2small['country code'] = atollName[2:4]
    df2small['atoll name'] = atollName[5:]
    unwanted = df2.columns[df2.columns.str.startswith('Unnamed')]
    df2.drop(unwanted, axis=1, inplace=True)
    unwanted = df3.columns[df3.columns.str.startswith('Unnamed')]
    df3.drop(unwanted, axis=1, inplace=True)
    unwanted = dfatoll.columns[dfatoll.columns.str.startswith('Unnamed')]
    dfatoll.drop(unwanted, axis=1, inplace=True)
    unwanted = df2small.columns[df2small.columns.str.startswith('Unnamed')]
    df2small.drop(unwanted, axis=1, inplace=True)
    frames2 = [df2all, df2]
    frames3 = [df3all, df3]
    frames4 = [dfatollall, dfatoll]
    framessmall = [df2smallall, df2small]
    df2all = pd.concat(frames2)
    df3all = pd.concat(frames3)
    dfatollall = pd.concat(frames4)
    df2smallall = pd.concat(framessmall)
#%% save large dataframes
morphOutput = r'G:\Shared drives\Ortiz Atolls Database\MorphometricOutput' # Location that the output will be saved to
countryName = 'AllAtollsNew'
full_path = morphOutput + '\\' + countryName + '\\Regional_Analysis'
os.chdir(full_path) # set working directory to the atoll directory
df2all.to_csv('df_motu_allACO.csv')
df3all.to_csv('df_reef_flat_allACO.csv')
dfatollall.to_csv('df_atollOnly_all.csv')
df2smallall.to_csv('df_smallmotu_all.csv')
#%% Alternatively if large dataframes exist, just read them in large dataframes
morphOutput = r'G:\Shared drives\Ortiz Atolls Database\MorphometricOutput' # Location that the output will be saved to
countryName = 'AllAtollsNew'
full_path = morphOutput + '\\' + countryName + '\\Regional_Analysis'
os.chdir(full_path) # set working directory to the atoll directory
df3all = pd.read_csv('df_reef_flat_allACO.csv')
df2all = pd.read_csv('df_motu_allACO.csv')
dfatollall = pd.read_csv('df_atollOnly_all.csv')
df2smallall = pd.read_csv('df_smallmotu_all.csv')
df_binned2 = pd.read_csv('French Polynesia' + ' df_binned.csv')
df3all['bins latitude'] = pd.cut(df3all['centroid_lat'], bins = [-25, -13, -3, 4, 15], labels = ['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15'], ordered = False)
df2all['bins latitude'] = pd.cut(df2all['centroid_lat'], bins = [-25, -13, -3, 4, 15], labels = ['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15'], ordered = False)
dfatollall['bins latitude'] = pd.cut(dfatollall['centroid_lat'], bins = [-25, -13, -3, 4, 15], labels = ['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15'], ordered = False)
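As a quick, self-contained sketch of how these `pd.cut` calls assign labels (the latitudes below are made up to hit each bin, not values from the composites; bins are left-open/right-inclusive intervals):

```python
import pandas as pd

# hypothetical centroid latitudes, one per interval of the script's bin edges
lats = pd.Series([-20.0, -8.0, 0.0, 10.0])
labels = pd.cut(lats, bins=[-25, -13, -3, 4, 15],
                labels=['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15'],
                ordered=False)
print(list(labels))  # ['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15']
```

`ordered=False` allows a label to be reused for disjoint intervals, which is why the exposure bins later in the script can map both 0–45° and 315–360° to 'North'.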
#%% decide on grouping (regional or all or other)
df3all['bins abs latitude'] = pd.cut(df3all['centroid_lat'].abs(), bins = [-1, 4.7, 14, 30], labels = ['low', 'mid', 'high'], ordered = False)
df2all['bins abs latitude'] = pd.cut(df2all['centroid_lat'].abs(), bins = [-1, 4.7, 14, 30], labels = ['low', 'mid', 'high'], ordered = False)
atoll_centroids = df3all.groupby(['atoll name']).mean()[['centroid_lat','centroid_long']]
region_bin = df3all.groupby(['atoll name']).first()[['country code']]
t2 = region_bin.groupby('country code').size()
df3all_PF = df3all[df3all['country code'] == 'PF']
df2all_PF = df2all[df2all['country code'] == 'PF']
# depending on plotting interest/grouping
# region_name = 'French Polynesia'
# df_reef = df3all_PF
# df_motu = df2all_PF
region_name = 'All Atolls'
df_reef = df3all
df_motu = df2all
#%% create summary tables
df_motu_summary = df_motu.groupby(['atoll name','motu index']).first()[['ocean basin','country code','bins abs latitude']]
df_motu_summary[['motu label','reef flat label','centroid_lat']] = df_motu.groupby(['atoll name','motu index']).mean()[['motu label','reef flat label','centroid_lat']]
df_motu_summary[['area (m^2)','perimeter (m)','mean motu to reef flat distance (m)','mean motu lagoon to reef flat lagoon (m)','mean motu width (m)','mean ocean reef width (m)', 'mean lagoon reef width (m)','motu length (m)','ocean side motu length (m)','lagoon side motu length (m)']] = df_motu.groupby(['atoll name','motu index']).mean()[['area m^2','perimeter m','motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width','motu length','ocean side motu length','lagoon side motu length']]
df_motu_summary[['std motu to reef flat distance (m)','std motu lagoon to reef flat lagoon (m)','std motu width (m)','std ocean reef width (m)', 'std lagoon reef width (m)']] = df_motu.groupby(['atoll name','motu index']).std()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width']]
df_reef_summary = df_reef.groupby(['atoll name','reef flat index']).mean()[['reef flat label','centroid_lat']]
df_reef_summary[['area (m^2)','perimeter (m)','mean reef flat width (m)','mean effective reef flat width (m)','mean reef flat width motu (std)','ocean side reef flat length (m)']] = df_reef.groupby(['atoll name','reef flat index']).mean()[['area m^2','perimeter R','reef flat width','effective reef flat width','reef flat width motu','ocean side reef flat length']]
df_reef_summary[['std reef flat width (m)','std effective reef flat width (m)','std reef flat width motu (m)']] = df_reef.groupby(['atoll name','reef flat index']).std()[['reef flat width','effective reef flat width','reef flat width motu']]
#%% totals
def NumberObjects(m, s1):
    mt = m.copy()
    num = len(mt[s1].unique())
    return num
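`NumberObjects` is handed to `groupby().apply` with the column name as a keyword argument; a tiny self-contained sketch (using a hypothetical two-atoll table, not real data) of how that counts distinct motu per atoll:

```python
import pandas as pd

def NumberObjects(m, s1):
    # number of distinct values of column s1 within one group
    return len(m[s1].unique())

# hypothetical motu table: atoll A has motu 1 and 2, atoll B has motu 5 and 7
df = pd.DataFrame({'atoll name': ['A', 'A', 'A', 'B', 'B'],
                   'motu index': [1, 1, 2, 5, 7]})
counts = df.groupby('atoll name').apply(NumberObjects, s1='motu index')
print(counts.to_dict())  # {'A': 2, 'B': 2}
```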
df_totals = df_motu.groupby('atoll name').first()[['ocean basin','country code','bins abs latitude']]
df_totals[['atoll centroid_lat', 'atoll centroid_long']] = df_motu.groupby('atoll name').mean()[['centroid_lat', 'centroid_long']]
df_totals['Number Motu'] = df_motu.groupby('atoll name').apply(NumberObjects,s1 = 'motu index')
df_totals['Number Reef Flat'] = df_reef.groupby('atoll name').apply(NumberObjects,s1 = 'reef flat index')
#%%
df_totals[['total motu area (m^2)','total motu perimeter (m)','total motu length (m)','total ocean side motu length (m)','total lagoon side motu length (m)']] = df_motu_summary.groupby('atoll name').sum()[['area (m^2)','perimeter (m)','motu length (m)','ocean side motu length (m)','lagoon side motu length (m)']]
df_totals[['mean motu to reef flat distance (m)','mean motu lagoon to reef flat lagoon (m)','mean motu width (m)']] = df_motu.groupby('atoll name').mean()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width']]
df_totals[['std motu to reef flat distance (m)','std motu lagoon to reef flat lagoon (m)','std motu width (m)']] = df_motu.groupby('atoll name').std()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width',]]
df_totals[['total reef flat area (m^2)','total reef flat perimeter (m)','total ocean side reef flat length (m)']] = df_reef_summary.groupby('atoll name').sum()[['area (m^2)','perimeter (m)','ocean side reef flat length (m)']]
df_totals[['mean reef flat width (m)','mean effective reef flat width (m)']] = df_reef.groupby('atoll name',).mean()[['reef flat width','effective reef flat width']]
df_totals[['std reef flat width (m)','std effective reef flat width (m)']] = df_reef.groupby('atoll name').std()[['reef flat width','effective reef flat width']]
df_totals['percent reef flat length covered by motu (%)'] = df_totals['total ocean side motu length (m)']/df_totals['total ocean side reef flat length (m)'] *100
df_totals['percent reef flat area covered by motu (%)'] = df_totals['total motu area (m^2)']/df_totals['total reef flat area (m^2)'] *100
df_totals['bins latitude'] = pd.cut(df_totals['atoll centroid_lat'], bins = [-25, -13, -3, 4, 15], labels = ['-25 to -13', '-13 to -3', '-3 to 4', '4 to 15'], ordered = False)
df_totals.to_csv(region_name + ' df_totals_ACO.csv')
#%% Create binned large dataFrames
df_binned = df_reef.groupby(['atoll name','bins ac']).mean()[['centroid_lat', 'centroid_long','reef flat width','effective reef flat width','reef flat width motu','total binned reef flat length']]
df_binned.columns = [['atoll centroid_lat', 'atoll centroid_long','mean reef flat width (m)','mean effective reef flat width (m)','mean reef flat width motu (m)','total binned reef flat length (m)']]
df_binned[['bins abs latitude']] = df_reef.groupby(['atoll name','bins ac']).first()[['bins abs latitude']]
df_binned[['std reef flat width (m)','std effective reef flat width (m)']] = df_reef.groupby(['atoll name','bins ac']).std()[['reef flat width','effective reef flat width']]
df_binned[['mean motu to reef flat distance (m)','mean motu lagoon to reef flat lagoon (m)','mean motu width (m)','mean ocean reef width (m)', 'mean lagoon reef width (m)','total binned motu length (m)']] = df_motu.groupby(['atoll name','bins ac']).mean()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width','total binned motu length']]
df_binned[['std motu to reef flat distance (m)','std motu lagoon to reef flat lagoon (m)','std motu width (m)','std ocean reef width (m)', 'std lagoon reef width (m)']] = df_motu.groupby(['atoll name','bins ac']).std()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width']]
df_binned['percent reef flat length covered by motu (%)'] = df_binned['total binned motu length (m)'].squeeze().divide(df_binned['total binned reef flat length (m)'].squeeze(),fill_value = 0)*100
df_binned = df_binned.reset_index(drop = False)
df_binned.to_csv(region_name + ' df_binnedACO.csv')
#%% merge small and large motu
df_motu_summary_large = df_motu_summary.reset_index(drop = False)
df2all_small2 = df2smallall.reset_index(drop = False)
maxMotu = df_motu_summary_large[['atoll name','motu index']].groupby('atoll name').max()
df2all_small2['motu index'] = df2all_small2['small motu index'] + maxMotu.loc[df2all_small2['atoll name']].reset_index(drop = 'atoll name').squeeze()
frames = [df2all_small2, df_motu_summary_large]
df_motu_summary_all = pd.concat(frames)
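The reindexing above offsets each small-motu index by its atoll's largest large-motu index, so the two tables can be concatenated without index collisions. A minimal sketch of that offset trick with hypothetical data (the script uses `reset_index(...).squeeze()`; `.to_numpy()` below achieves the same alignment-free addition):

```python
import pandas as pd

# hypothetical tables: atoll A has large motu 1-2, atoll B has large motu 1
large = pd.DataFrame({'atoll name': ['A', 'A', 'B'], 'motu index': [1, 2, 1]})
small = pd.DataFrame({'atoll name': ['A', 'B'], 'small motu index': [1, 1]})

# largest existing index per atoll, looked up row-by-row for the small table
max_motu = large.groupby('atoll name')['motu index'].max()
small['motu index'] = small['small motu index'] + max_motu.loc[small['atoll name']].to_numpy()
print(small['motu index'].tolist())  # [3, 2]
```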
#%% create total motu summary
df_totals_all = df_motu.groupby('atoll name').first()[['ocean basin','country code','bins abs latitude']]
df_totals_all['Number Motu'] = df_motu_summary_all.groupby('atoll name').apply(NumberObjects,s1 = 'motu index')
df_totals_all[['total motu area (km^2)']] = df_motu_summary_all.groupby('atoll name').sum()[['area (m^2)']]/1000000
df_totals_all[['total motu perimeter (km)']] = df_motu_summary_all.groupby('atoll name').sum()[['perimeter (m)']]/1000
df_totals_all['Number Motu small'] = df2all_small2.groupby('atoll name').apply(NumberObjects,s1 = 'motu index')
df_totals_all[['motu area small (km^2)']] = df2all_small2.groupby('atoll name').sum()[['area (m^2)']]/1000000
df_totals_all[['motu perimeter small (km)']] = df2all_small2.groupby('atoll name').sum()[['perimeter (m)']]/1000
df_totals_all['Number Motu large'] = df_motu_summary_large.groupby('atoll name').apply(NumberObjects,s1 = 'motu index')
df_totals_all[['motu area large (km^2)']] = df_motu_summary_large.groupby('atoll name').sum()[['area (m^2)']]/1000000
df_totals_all[['motu perimeter large (km)']] = df_motu_summary_large.groupby('atoll name').sum()[['perimeter (m)']]/1000
df_totals_all.to_csv('AllMotuSummarySmallLargeMotu.csv')
#%%Motu summary data
df_reef['exposure bin'] = pd.cut(df_reef['exposure angle'], bins = [-1, 45, 135, 225, 315, 360], labels = ['North', 'East', 'South', 'West', 'North'], ordered = False)
df_motu['exposure bin'] = pd.cut(df_motu['exposure angle'], bins = [-1, 45, 135, 225, 315, 360], labels = ['North', 'East', 'South', 'West', 'North'], ordered = False)
df_motu_summary = df_motu.groupby(['atoll name','motu index']).first()[['ocean basin','country code','bins abs latitude','motu excentricity']]
df_motu_summary[['motu label','reef flat label','centroid_lat']] = df_motu.groupby(['atoll name','motu index']).mean()[['motu label','reef flat label','centroid_lat']]
df_motu_summary[['area (m^2)','perimeter (m)','mean motu to reef flat distance (m)','mean motu lagoon to reef flat lagoon (m)','mean motu width (m)','mean ocean reef width (m)', 'mean lagoon reef width (m)','motu length (m)','ocean side motu length (m)','lagoon side motu length (m)']] = df_motu.groupby(['atoll name','motu index']).mean()[['area m^2','perimeter m','motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width','motu length','ocean side motu length','lagoon side motu length']]
df_motu_summary[['std motu to reef flat distance (m)','std motu lagoon to reef flat lagoon (m)','std motu width (m)','std ocean reef width (m)', 'std lagoon reef width (m)']] = df_motu.groupby(['atoll name','motu index']).std()[['motu to reef flat distance','motu lagoon to reef flat lagoon','motu width','ocean reef width', 'lagoon reef width']]
df_motu_summary[['directional bin']] = df_motu[df_motu['o/l label']=='ocean'].groupby(['atoll name','motu index'])['bins ac'].agg(pd.Series.mode).to_frame()
df_motu_summary[['exposure bin']] = df_motu[df_motu['o/l label']=='ocean'].groupby(['atoll name','motu index'])['exposure bin'].agg(pd.Series.mode).to_frame()
df_motu_summary.loc[df_motu_summary['exposure bin'].str.len() < 3.0, 'exposure bin'] = np.nan
m = df_motu_summary[df_motu_summary['directional bin'] != df_motu_summary['exposure bin']]
#%% Exposure Angle & Position Angle
from scipy.stats import circmean
def circMean(m, s1):
    mt = m.copy()
    r = circmean(mt[[s1]], high = 360, low = 0)
    return r
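`circmean` is needed here because compass angles wrap at 360°: the arithmetic mean of two headings straddling north (350° and 10°) is 180°, the opposite direction. A numpy-only sketch of the same vector-averaging idea (`circ_mean_deg` is an illustrative helper, not part of this script):

```python
import numpy as np

def circ_mean_deg(angles_deg):
    # angle of the mean unit vector, mapped back into [0, 360)
    rad = np.deg2rad(angles_deg)
    mean_angle = np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())
    return np.rad2deg(mean_angle) % 360

print(np.mean([350, 10]))        # 180.0 -- misleading arithmetic mean
print(circ_mean_deg([350, 10]))  # ~0 (mod 360): the headings straddle north
```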
df_motu_summary['mean exposure angle'] = df_motu[df_motu['o/l label']=='ocean'].groupby(['atoll name','motu index']).apply(circMean, s1 = 'exposure angle')
df_motu_summary['mean exposure bin'] = pd.cut(df_motu_summary['mean exposure angle'], bins = [-1, 45, 135, 225, 315, 360], labels = ['North', 'East', 'South', 'West', 'North'], ordered = False)
df_motu_summary['mean position angle'] = df_motu[df_motu['o/l label']=='ocean'].groupby(['atoll name','motu index']).apply(circMean, s1 = 'binning angle ac')
df_motu_summary['mean position bin'] = pd.cut(df_motu_summary['mean position angle'], bins = [-1, 45, 135, 225, 315, 360], labels = ['North', 'East', 'South', 'West', 'North'], ordered = False)
df_merged = df_motu_summary.merge(df_reef_summary, on=['atoll name','reef flat label'])
#%% valuable column names
s1 = 'mean lagoon reef width (m)'
s2 = 'mean motu width (m)'
s3 = 'mean ocean reef width (m)'
s4 = 'motu total reef width (m)'
s5 = 'motu-reef-flat-dist / reef-flat width'
s6 = 'motu length / reef-flat length'
df_merged['motu total reef width (m)'] = df_merged[s1] + df_merged[s2] + df_merged[s3]
#x = motu length / atoll perimeter; y-axis = motu-reef-flat-dist / reef-flat width
df_merged['motu-reef-flat-dist / reef-flat width'] = df_merged['mean ocean reef width (m)']/df_merged['motu total reef width (m)']
df_merged['motu length / reef-flat length'] = df_merged['motu length (m)']/df_merged['ocean side reef flat length (m)']
df_mergedm = df_merged[df_merged['mean position bin'] != df_merged['mean exposure bin']]
colors = {'low':'blue', 'mid':'orange', 'high':'green'}
p1 = s5
p2 = s6
#%% Plot critical reef flat width vs motu length FP
p1 = s3
p2 = 'motu length (m)'
cmp = plt.get_cmap('gist_earth',6)
ax1 = df_merged[(df_merged['mean position bin'] == 'North') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(1), xlim = (0,70000), ylim = (0,3000), label = 'North',s=25)
df_merged[(df_merged['mean position bin'] == 'East') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(2), xlim = (0,70000), ylim = (0,3000),ax=ax1, label = 'East',s=25)
df_merged[(df_merged['mean position bin'] == 'South') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(3), xlim = (0,70000), ylim = (0,3000),ax=ax1, label = 'South',s=10)
df_merged[(df_merged['mean position bin'] == 'West') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(4), xlim = (0,6000), ylim = (0,1500),ax=ax1, label = 'West',s=10)
plt.legend(framealpha=0.0)
plt.yticks(np.arange(0,1500,step=250),fontsize=12)
plt.xticks(np.arange(0,60000,step=15000),np.arange(0,60,step=15),fontsize=12)
plt.xlabel('Motu Length (km)')
plt.ylabel('Ocean Reef Width (m)')
ax1.tick_params(axis='both',which='major',width=2,length=7,direction='in')
#plt.savefig('MotuLengthOceanReefWidthFP.png',dpi=600)
#%% Plot critical reef flat width vs motu length normalized
#p1 = s3
#p2 = 'motu length (m)'
p1 = s5
p2 = s6
cmp = plt.get_cmap('gist_earth',6)
ax1 = df_merged[(df_merged['mean position bin'] == 'North') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(1), xlim = (0,70000), ylim = (0,3000), label = 'North',s=25)
df_merged[(df_merged['mean position bin'] == 'East') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(2), xlim = (0,70000), ylim = (0,3000),ax=ax1, label = 'East',s=25)
df_merged[(df_merged['mean position bin'] == 'South') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(3), xlim = (0,70000), ylim = (0,3000),ax=ax1, label = 'South',s=10)
df_merged[(df_merged['mean position bin'] == 'West') & (df_merged['country code']== 'PF')].plot.scatter(y=p1, x=p2, c= cmp(4), xlim = (0,1), ylim = (0,1),ax=ax1, label = 'West',s=10)
plt.legend(framealpha=0.0)
plt.yticks(np.arange(0,1.1,step=.25))
plt.xticks(np.arange(0,1.1,step=.25))
plt.xlabel('Motu Length/Reef-flat Length')
plt.ylabel('Ocean Reef Width/Total Reef-flat Width')
ax1.tick_params(axis='both',which='major',width=2,length=7,direction='in')
#plt.savefig('MotuLengthOceanReefWidthFPNormalized.png',dpi=600)
#%%strings
df_merged = df_motu_summary.merge(df_reef_summary, on=['atoll name','reef flat label'])
s1 = 'mean lagoon reef width (m)'
s2 = 'mean motu width (m)'
s3 = 'mean ocean reef width (m)'
s4 = 'motu total reef width (m)'
s5 = 'motu-reef-flat-dist / reef-flat width'
s6 = 'motu length / reef-flat length'
df_merged['bins abs latitude'] = pd.cut(df_merged['centroid_lat_x'].abs(), bins = [-1, 4.7, 14, 30], labels = ['low', 'mid', 'high'], ordered = False)
#%%Motu length v reef width (m) binned by direction
df_merged['motu total reef width (m)'] = df_merged[s1] + df_merged[s2] + df_merged[s3]
df_merged['motu-reef-flat-dist / reef-flat width'] = df_merged['mean ocean reef width (m)']/df_merged['motu total reef width (m)']
df_merged['motu length / reef-flat length'] = df_merged['motu length (m)']/df_merged['ocean side reef flat length (m)']
p1 = s3
p2 = 'motu length (m)'
blues = plt.get_cmap('Blues',5)
purples = plt.get_cmap('Purples',5)
reds = plt.get_cmap('Reds',5)
oranges = plt.get_cmap('Oranges',6)
greens = plt.get_cmap('Greens',5)
df_merged['bins abs lat'] = df_merged['bins abs latitude'].map({'high': 'high tropical', 'mid': 'mid tropical', 'low':'equatorial'})
ax1 = df_merged[df_merged['bins abs latitude'] == 'low'].plot.scatter(y=p1, x=p2, color=blues(3), label = 'equatorial')
df_merged[df_merged['bins abs latitude'] == 'mid'].plot.scatter(y=p1, x=p2, color=oranges(3), ax=ax1, label = 'mid tropical')
df_merged[df_merged['bins abs latitude'] == 'high'].plot.scatter(y=p1, x=p2, color=greens(3), xlim = (0,70000), ylim = (0,3000),ax=ax1, label = 'high tropical')
plt.legend(framealpha=0.0)
plt.yticks(np.arange(0,3000,step=500))
plt.xticks(np.arange(0,70000,step=15000),np.arange(0,70,step=15))
# legend = plt.legend()
# legend.get_frame().set_facecolor('none')
plt.xlabel('Motu Length (km)')
plt.ylabel('Ocean Reef Width (m)')
ax1.tick_params(axis='both',which='major',width=2,length=7,direction='in')
#plt.savefig('MotuLengthOceanReefWidthAll.png',dpi=600)
#%% normalized All data critical reef width vs length
p1 = s5
p2 = s6
blues = plt.get_cmap('Blues',5)
purples = plt.get_cmap('Purples',5)
reds = plt.get_cmap('Reds',5)
oranges = plt.get_cmap('Oranges',6)
greens = plt.get_cmap('Greens',5)
df_merged['bins abs lat'] = df_merged['bins abs latitude'].map({'high': 'high tropical', 'mid': 'mid tropical', 'low':'equatorial'})
ax1 = df_merged[df_merged['bins abs latitude'] == 'low'].plot.scatter(y=p1, x=p2, color=blues(3), label = 'equatorial')
df_merged[df_merged['bins abs latitude'] == 'mid'].plot.scatter(y=p1, x=p2, color=oranges(3), ax=ax1, label = 'mid tropical')
df_merged[df_merged['bins abs latitude'] == 'high'].plot.scatter(y=p1, x=p2, color=greens(3), xlim = (0,1), ylim = (0,1),ax=ax1, label = 'high tropical')
plt.legend(framealpha=0.0)
#plt.yticks(np.arange(0,1500,step=250),fontsize=12)
plt.yticks(np.arange(0,1.1,step=.25))
plt.xticks(np.arange(0,1.1,step=.25))
plt.xlabel('Motu Length/Reef-flat Length')
plt.ylabel('Ocean Reef Width/Total Reef-flat Width')
ax1.tick_params(axis='both',which='major',width=2,length=7,direction='in')
#plt.savefig('MotuLengthOceanReefWidthAllNorm.png',dpi=600)
#%% 2D histograms
# libraries
df_merged4 = df_merged.reset_index(drop = False)
df_merged4[['log 10 motu length (m)']] = np.log10(df_merged4[['motu length (m)']])
df_merged4[['log 10 motu width (m)']] = np.log10(df_merged4[['mean motu width (m)']])
df_merged5 = df_merged4[df_merged4['bins abs latitude'] == 'high'] #change to mid, low
#sns.displot(df_merged5, x='log 10 motu length (m)', y='log 10 motu width (m)', bins = [10,10])
#sns.displot(df_merged4, x='log 10 motu length (m)', y='log 10 motu width (m)', hue='bins abs latitude', kind="kde")
plt.xlim([0, 5])
plt.ylim([0, 5])
#sns.displot(df_merged4, x='motu length (m)', y='mean motu width (m)', hue='bins abs latitude', kind="kde")
sns.displot(df_merged5, x='log 10 motu length (m)', y='log 10 motu width (m)', hue='bins abs latitude', kind="kde",fill = True, levels = (0.05,0.1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,1))
#%% all widths in one with the colors for FP
df_binned2['label bin'] = df_binned2['bins ac'].map({'North': 'a', 'East': 'b','South': 'c', 'West': 'd'})
df_binned2[['a) motu width','d) ocean reef width', 'c) lagoon reef width','b) reef flat width','e) effective reef flat width']] = df_binned2[['mean motu width (m)','mean ocean reef width (m)', 'mean lagoon reef width (m)','mean reef flat width (m)','mean effective reef flat width (m)']]
axs = df_binned2[['label bin','a) motu width','d) ocean reef width', 'c) lagoon reef width','b) reef flat width','e) effective reef flat width']].boxplot(
    by='label bin', figsize=(12, 6), layout=(1, 5), patch_artist=True, grid=False,
    color={'whiskers': 'black', 'caps': 'black', 'medians': 'black', 'boxes': 'black'})
cmp = plt.get_cmap('gist_earth',6)
for i in range(0,5):
    axs[i].findobj(matplotlib.patches.Patch)[0].set_facecolor(cmp(1))
    axs[i].findobj(matplotlib.patches.Patch)[1].set_facecolor(cmp(2))
    axs[i].findobj(matplotlib.patches.Patch)[2].set_facecolor(cmp(3))
    axs[i].findobj(matplotlib.patches.Patch)[3].set_facecolor(cmp(4))
axs[0].set_xticklabels(('North', 'East', 'South', 'West','North', 'East', 'South', 'West','North', 'East', 'South', 'West','North', 'East', 'South', 'West','North', 'East', 'South', 'West'))
axs[0].set(xlabel="", ylabel='mean width (m)')
axs[1].set(xlabel="")
axs[2].set(xlabel="")
axs[3].set(xlabel="")
axs[4].set(xlabel="")
plt.show()
#plt.savefig('WidthsFP_Boxplots.png')
#%%
df_merged['atoll name 2'] = df_merged.index
df_mergedbin = df_merged[['motu length (m)']]
df_mergedbin[['bins ac']] = df_merged[['mean position bin']]
df_mergedbin.reset_index(level=0, inplace=True)
df_binnedlength= df_mergedbin.groupby(['atoll name','bins ac']).mean()[['motu length (m)']]
df_binnedlength[['motu length (km)']] = df_binnedlength[['motu length (m)']]/1000
df_binnedlength.reset_index(level=1, inplace=True)
df_binnedlength['label bin'] = df_binnedlength['bins ac'].map({'North': 'a', 'East': 'b','South': 'c', 'West': 'd'})
#%% plot percent length blocked by motu binned box plot
df_binned2['label bin'] = df_binned2['bins ac'].map({'North': 'a', 'East': 'b','South': 'c', 'West': 'd'})
fig, ax = plt.subplots(1, 2, figsize=(8, 5))
df_binned2.boxplot('percent reef flat length covered by motu (%)', 'label bin', ax=ax[1], patch_artist=True, grid=False,
                   color={'whiskers': 'black', 'caps': 'black', 'medians': 'black', 'boxes': 'black'})
df_binnedlength.boxplot('motu length (km)', 'label bin', ax=ax[0], patch_artist=True, grid=False,
                        color={'whiskers': 'black', 'caps': 'black', 'medians': 'black', 'boxes': 'black'})
ax[0].set_xticklabels(('North', 'East', 'South', 'West'))
ax[1].set_xticklabels(('North', 'East', 'South', 'West'))
ax[1].set(xlabel="", ylabel='reef flat length blocked by motu (%)', title='percent reef flat blocked by motu')
ax[0].set(xlabel="", ylabel='mean motu length (km)', title='motu length')
cmp = plt.get_cmap('gist_earth',6)
for i in range(0,2):
    ax[i].findobj(matplotlib.patches.Patch)[0].set_facecolor(cmp(1))
    ax[i].findobj(matplotlib.patches.Patch)[1].set_facecolor(cmp(2))
    ax[i].findobj(matplotlib.patches.Patch)[2].set_facecolor(cmp(3))
    ax[i].findobj(matplotlib.patches.Patch)[3].set_facecolor(cmp(4))
ax[1].set_ylim((-1,120))
ax[0].set_ylim((-.4,44))
#plt.savefig('%BlockedFP_Boxplots.png')
#%%
df_motu_summary.to_csv(region_name + ' df_motu_summaryACO.csv')
df_reef_summary.to_csv(region_name + ' df_reef_summaryACO.csv')
#%% total perimeter/area by latitude
df_reef['bins latitude 3'] = pd.cut(df_reef['centroid_lat'], bins = [-25,-23,-21,-19,-17,-15,-13,-11,-9,-7,-5,-3,-1,1,3,5,7,9,11,13,15], labels = [-24, -22, -20, -18, -16, -14, -12, -10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10, 12, 14], ordered = False)
df_motu['bins latitude 3'] = pd.cut(df_motu['centroid_lat'], bins = [-25,-23,-21,-19,-17,-15,-13,-11,-9,-7,-5,-3,-1,1,3,5,7,9,11,13,15], labels = [-24, -22, -20, -18, -16, -14, -12, -10, -8, -6, -4, -2, 0, 2, 4, 6, 8, 10, 12, 14], ordered = False)
df_reef['bins latitude 4'] = pd.cut(df_reef['centroid_lat'], bins = [-25.5,-22.5,-19.5,-16.5,-13.5,-10.5,-7.5,-4.5,-1.5,1.5,4.5,7.5,10.5,13.5,16.5], labels = [-24, -21, -18, -15, -12, -9, -6, -3, 0, 3, 6, 9, 12, 15], ordered = False)
df_motu['bins latitude 4'] = pd.cut(df_motu['centroid_lat'], bins = [-25.5,-22.5,-19.5,-16.5,-13.5,-10.5,-7.5,-4.5,-1.5,1.5,4.5,7.5,10.5,13.5,16.5], labels = [-24, -21, -18, -15, -12, -9, -6, -3, 0, 3, 6, 9, 12, 15], ordered = False)
s1 = 'bins latitude 4'
df_motu_summary[s1] = df_motu.groupby(['atoll name','motu index']).first()[[s1]]
df_reef_summary[s1] = df_reef.groupby(['atoll name','reef flat index']).first()[[s1]]
df_motu_summary = df_motu_summary.reset_index(drop=False)
df_lat_totals = df_motu_summary.groupby([s1]).sum()[['area (m^2)','perimeter (m)']]
df_lat_totals['number atolls'] = df_motu_summary.groupby([s1]).nunique()[['atoll name']]
df_lat_totals['number motu'] = df_motu_summary.groupby([s1]).count()[['area (m^2)']]
df_lat_totals['total motu area (km^2)'] = df_lat_totals['area (m^2)']/1000000
df_lat_totals['total motu perimeter (km)'] = df_lat_totals['perimeter (m)']/1000
df_lat_totals[['total reef flat area (m^2)','total reef flat perimeter (m)']] = df_reef_summary.groupby([s1]).sum()[['area (m^2)','perimeter (m)']]
df_lat_totals['number reef flat'] = df_reef_summary.groupby([s1]).count()[['area (m^2)']]
df_lat_totals['total reef flat area (km^2)'] = df_lat_totals['total reef flat area (m^2)']/1000000
df_lat_totals['total reef flat perimeter (km)'] = df_lat_totals['total reef flat perimeter (m)']/1000
df_lat_totals = df_lat_totals.drop(['area (m^2)','perimeter (m)','total reef flat area (m^2)','total reef flat perimeter (m)'], axis=1)
#%%
df_lat_totals2 = df_lat_totals.reset_index(drop = False)
df_lat_totals2 = df_lat_totals2.append({'bins latitude 4':-27,'number motu':0, 'total motu area (km^2)':0, 'total motu perimeter (km)':0, 'number reef flat':0, 'total reef flat area (km^2)':0,'total reef flat perimeter (km)':0},ignore_index=True)
df_lat_totals2 = df_lat_totals2.append({'bins latitude 4':15,'number motu':0, 'total motu area (km^2)':0, 'total motu perimeter (km)':0, 'number reef flat':0, 'total reef flat area (km^2)':0,'total reef flat perimeter (km)':0},ignore_index=True)
df_lat_totals2 = df_lat_totals2.sort_values([s1])
df_lat_totals2 = df_lat_totals2.reset_index(drop=True)
#%%
df2=df2all_PF
df3=df3all_PF
blues = plt.get_cmap('Blues',6)
purples = plt.get_cmap('Purples',6)
reds = plt.get_cmap('Reds',6)
oranges = plt.get_cmap('Oranges',6)
greens = plt.get_cmap('Greens',6)
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
lineW = 2
# Draw the density plot
sns.distplot(df2['motu width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'motu width', color = reds(4))
sns.distplot(df3['reef flat width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'reef flat width', color = blues(4))
sns.distplot(df2['lagoon reef width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'lagoon reef width', color = oranges(4))
sns.distplot(df2['ocean reef width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'ocean reef width', color = purples(4))
sns.distplot(df3['effective reef flat width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'effective reef flat width', color = greens(4))
# Plot formatting
plt.legend(prop={'size': 12}, title = 'Widths')
plt.title('a) French Polynesia Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.tight_layout()
#plt.savefig('DensityFP_AllWidths.png',dpi=600)
#%% density functions for the width measurements - motu width
df = df2.copy()
s2 = 'bins ac'
s1 = 'motu width'
#Draw the density plot
linecolor = reds
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
sns.distplot(df[df[s2] == 'North'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(5),
label = 'North')
sns.distplot(df[df[s2] == 'East'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(4),
label = 'East')
sns.distplot(df[df[s2] == 'South'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(3),
label = 'South')
sns.distplot(df[df[s2] == 'West'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(2),
label = 'West')
# Plot formatting
plt.title('b) Motu Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
#plt.savefig('DensityFP_motuwidth.png',dpi=600)
#%% density functions for the width measurements - reef flat width
df = df3.copy()
s2 = 'bins ac'
s1 = 'reef flat width'
#Draw the density plot
linecolor = blues
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
sns.distplot(df[df[s2] == 'North'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(5),
label = 'North')
sns.distplot(df[df[s2] == 'East'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(4),
label = 'East')
sns.distplot(df[df[s2] == 'South'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(3),
label = 'South')
sns.distplot(df[df[s2] == 'West'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(2),
label = 'West')
# Plot formatting
plt.title('c) Reef Flat Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
#plt.savefig('DensityFP_rfwidth.png',dpi=600)
#%% density functions for the width measurements - lagoon reef width
df = df2.copy()
s2 = 'bins ac'
s1 = 'lagoon reef width'
#Draw the density plot
linecolor = oranges
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
sns.distplot(df[df[s2] == 'North'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(5),
label = 'North')
sns.distplot(df[df[s2] == 'East'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(4),
label = 'East')
sns.distplot(df[df[s2] == 'South'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(3),
label = 'South')
sns.distplot(df[df[s2] == 'West'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(2),
label = 'West')
# Plot formatting
plt.title('d) Lagoon Reef Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
#plt.savefig('DensityFP_motulagoonwidth.png',dpi=600)
#%% density functions for the width measurements - ocean reef width
df = df2.copy()
s2 = 'bins ac'
s1 = 'ocean reef width'
#Draw the density plot
linecolor = purples
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
sns.distplot(df[df[s2] == 'North'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(5),
label = 'North')
sns.distplot(df[df[s2] == 'East'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(4),
label = 'East')
sns.distplot(df[df[s2] == 'South'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(3),
label = 'South')
sns.distplot(df[df[s2] == 'West'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(2),
label = 'West')
# Plot formatting
plt.title('e) Ocean Reef Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
#plt.savefig('DensityFP_motuoceanwidth.png',dpi=600)
#%% density functions for the width measurements - effective reef width
df = df3.copy()
s2 = 'bins ac'
s1 = 'effective reef flat width'
#df = df_motu[df_motu['motu length'] > 1000].copy()
# df = df2.copy()
# s1 = 'ocean reef width'
# s1 = 'lagoon reef width'
# s1 = 'motu width'
#Draw the density plot
linecolor = greens
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
sns.distplot(df[df[s2] == 'North'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(5),
label = 'North')
sns.distplot(df[df[s2] == 'East'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(4),
label = 'East')
sns.distplot(df[df[s2] == 'South'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(3),
label = 'South')
sns.distplot(df[df[s2] == 'West'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
color = linecolor(2),
label = 'West')
# Plot formatting
#plt.legend(prop={'size': 12}, title = s1)
plt.title('f) Effective Reef Flat Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.xlim([0, 2000])
plt.ylim([0,.013])
plt.yticks(np.arange(0,.015,step=.003))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
#plt.savefig('DensityFP_effectiverw.png',dpi=600)
#%% density functions for the width measurements - all atolls -
df2=df2all.copy()
df3=df3all.copy()
blues = plt.get_cmap('Blues',6)
purples = plt.get_cmap('Purples',6)
reds = plt.get_cmap('Reds',6)
oranges = plt.get_cmap('Oranges',6)
greens = plt.get_cmap('Greens',6)
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
lineW = 2
# Draw the density plot
sns.distplot(df2['motu width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'motu width', color = reds(4))
sns.distplot(df3['reef flat width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'reef flat width', color = blues(4))
sns.distplot(df2['lagoon reef width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'lagoon reef width', color = oranges(4))
sns.distplot(df2['ocean reef width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'ocean reef width', color = purples(4))
sns.distplot(df3['effective reef flat width'], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'effective reef flat width', color = greens(4))
# Plot formatting
plt.legend(prop={'size': 12}, title = 'Widths')
plt.title('a) All Atolls Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
plt.tight_layout()
#plt.savefig('DensityAll_AllWidths.png',dpi=600)
#%% density functions for the width measurements - all atolls - motu width
df = df2.copy()
s1 = 'motu width'
s2 = 'bins abs latitude'
linecolor = reds
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
# Draw the density plot
sns.distplot(df[df[s2] == 'low'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},bins=int(2000),
label = 'equatorial',color = linecolor(5))
sns.distplot(df[df[s2] == 'mid'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'mid tropical',color = linecolor(4))
sns.distplot(df[df[s2] == 'high'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'high tropical',color = linecolor(3))
# Plot formatting
plt.legend(prop={'size': 12}, title = s1)
plt.title('b) Motu Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
#plt.savefig('DensityAll_MotuWidths.png',dpi=600)
#%% density functions for the width measurements - all atolls - reef total width
df = df3.copy()
s1 = 'reef flat width'
s2 = 'bins abs latitude'
linecolor = blues
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
# Draw the density plot
sns.distplot(df[df[s2] == 'low'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},bins=int(2000),
label = 'equatorial',color = linecolor(5))
sns.distplot(df[df[s2] == 'mid'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'mid tropical',color = linecolor(4))
sns.distplot(df[df[s2] == 'high'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'high tropical',color = linecolor(3))
# Plot formatting
plt.legend(prop={'size': 12}, title = s1)
plt.title('c) Reef Flat Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
#plt.savefig('DensityAll_AllReefTotalWidths.png',dpi=600)
#%% density functions for the width measurements - all atolls - lagoon reef width
df = df2.copy()
s1 = 'lagoon reef width'
s2 = 'bins abs latitude'
linecolor = oranges
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
# Draw the density plot
sns.distplot(df[df[s2] == 'low'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},bins=int(2000),
label = 'equatorial',color = linecolor(5))
sns.distplot(df[df[s2] == 'mid'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'mid tropical',color = linecolor(4))
sns.distplot(df[df[s2] == 'high'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'high tropical',color = linecolor(3))
# Plot formatting
plt.legend(prop={'size': 12}, title = s1)
plt.title('d) Lagoon Reef Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
#plt.savefig('DensityAll_LagoonReefWidths.png',dpi=600)
#%% density functions for the width measurements - all atolls - ocean reef width
df = df2.copy()
s1 = 'ocean reef width'
s2 = 'bins abs latitude'
linecolor = purples
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
# Draw the density plot
sns.distplot(df[df[s2] == 'low'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},bins=int(2000),
label = 'equatorial',color = linecolor(5))
sns.distplot(df[df[s2] == 'mid'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'mid tropical',color = linecolor(4))
sns.distplot(df[df[s2] == 'high'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'high tropical',color = linecolor(3))
# Plot formatting
plt.legend(prop={'size': 12}, title = s1)
plt.title('e) Ocean Reef Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
#plt.savefig('DensityAll_OceanReefWidths.png',dpi=600)
#%% density functions for the width measurements - all atolls - effective width
df = df3.copy()
s1 = 'effective reef flat width'
s2 = 'bins abs latitude'
linecolor = greens
fig_dims = (4.5, 4)
fig, ax = plt.subplots(figsize=fig_dims)
# Draw the density plot
sns.distplot(df[df[s2] == 'low'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},bins=int(2000),
label = 'equatorial',color = linecolor(5))
sns.distplot(df[df[s2] == 'mid'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'mid tropical',color = linecolor(4))
sns.distplot(df[df[s2] == 'high'][s1], hist = False, kde = True,
kde_kws = {'linewidth': lineW},
label = 'high tropical',color = linecolor(3))
# Plot formatting
plt.legend(prop={'size': 12}, title = s1)
plt.title('f) Effective Reef Flat Width')
plt.xlabel('Width (m)')
plt.ylabel('Density')
plt.xlim([0, 2000])
plt.ylim([0,.008])
plt.yticks(np.arange(0,.008,step=.0025))
plt.xticks(np.arange(0,2000,step=500))
plt.tick_params(axis='both',which='major',width=2,length=7,direction='in')
plt.tight_layout()
leg = plt.legend()
leg.get_frame().set_linewidth(0.0)
#plt.savefig('DensityAll_EffectivereefWidths.png',dpi=600)
#%% calc. critical reef-flat widths for diff groups
def calcCritWidth(df,s1,s2,l,s3,border):
    '''Takes a dataframe, the column to threshold (s1), the column to summarise (s2),
    the length cutoff l, the bin column (s3), and the bin order.
    Returns a dataframe with a row per bin and the columns: mean, std, count, total count, percent count.'''
    aa = df[df[s1]>l][s2].agg(['mean','std','count'])
    aa['total count'] = df.count().max()
    aa['percent count'] = aa['count']/aa['total count'] * 100
    df2 = pd.DataFrame([aa],index=['all'])
    for i in df[s3].dropna().unique():
        aa2 = df[(df[s3]==i) & (df[s1]>l)][s2].agg(['mean','std','count'])
        aa2['total count'] = df[df[s3]==i].count().max()  # total motu in the given bin
        aa2['percent count'] = aa2['count']/aa2['total count'] * 100
        aa2.name = i
        df2 = df2.append([aa2])
    df2['length'] = l
    df2 = df2.reindex(border)
    return df2
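The aggregation pattern inside `calcCritWidth` can be sanity-checked on a small synthetic frame (hypothetical values, not from the atoll dataset):

```python
import pandas as pd

# Hypothetical data: three motu, two longer than the 1000 m cutoff.
df_demo = pd.DataFrame({'motu length (m)': [500, 2000, 3000],
                        'mean ocean reef width (m)': [100.0, 200.0, 300.0]})
aa = df_demo[df_demo['motu length (m)'] > 1000]['mean ocean reef width (m)'].agg(['mean', 'std', 'count'])
aa['total count'] = df_demo.count().max()                    # rows in the whole frame
aa['percent count'] = aa['count'] / aa['total count'] * 100  # share above the cutoff
print(aa['mean'], int(aa['count']), round(aa['percent count'], 1))  # -> 250.0 2 66.7
```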
dfnewlong = calcCritWidth(df_merged,'motu length (m)','mean ocean reef width (m)',10000,'bins abs latitude',['low','mid','high','all'])
dfnew = calcCritWidth(df_merged,'motu length (m)','mean ocean reef width (m)',1000,'bins abs latitude',['low','mid','high','all'])
dfNew = dfnew.append(dfnewlong)
#now calc. for normalized values
dfnew = calcCritWidth(df_merged,'motu length / reef-flat length','motu-reef-flat-dist / reef-flat width',.1,'bins abs latitude',['low','mid','high','all'])
dfnewl = calcCritWidth(df_merged,'motu length / reef-flat length','motu-reef-flat-dist / reef-flat width',.25,'bins abs latitude',['low','mid','high','all'])
dfNewNorm = dfnew.append(dfnewl)
#%%
#df_mergedFP = df_merged #if you've reset way back in the beginning
#%%
# dfpnew = calcCritWidth(df_mergedFP,'motu length (m)','mean ocean reef width (m)',0,'directional bin',['North','East','South','West','all'])
# dfpnewl = calcCritWidth(df_mergedFP,'motu length (m)','mean ocean reef width (m)',10000,'directional bin',['North','East','South','West','all'])
# dfNewfp = dfpnew.append(dfpnewl)
# #now calc. for normalized values
# dfnew = calcCritWidth(df_mergedFP,'motu length / reef-flat length','motu-reef-flat-dist / reef-flat width',.1,'directional bin',['North','East','South','West','all'])
# dfnewl = calcCritWidth(df_mergedFP,'motu length / reef-flat length','motu-reef-flat-dist / reef-flat width',.25,'directional bin',['North','East','South','West','all'])
# dfNewNormFP = dfnew.append(dfnewl)
# #export these tables to excel
# # Create some Pandas dataframes from some data.
# with pd.ExcelWriter('SummaryCriticalReefFlatWidth.xlsx') as writer:
# workbook=writer.book
# worksheet=workbook.add_worksheet('All Motu')
# writer.sheets['All Motu'] = worksheet
# worksheet.write_string(0, 0, 'Totals critical reef flat width (m)')
# dfNew.to_excel(writer, sheet_name='All Motu', startrow = 1)
# worksheet.write_string(13,0,'Normalized')
# dfNewNorm.to_excel(writer, sheet_name='All Motu', startrow = 14)
# worksheet=workbook.add_worksheet('FP Motu')
# writer.sheets['FP Motu'] = worksheet
# worksheet.write_string(0, 0, 'Totals critical reef flat width (m)')
# dfNewfp.to_excel(writer, sheet_name='FP Motu', startrow = 1)
# worksheet.write_string(13,0,'Normalized')
# dfNewNormFP.to_excel(writer, sheet_name='FP Motu', startrow = 14)
# tests/providers/hashicorp/secrets/test_vault.py (apache/airflow, Apache-2.0)
] | 1 | 2020-04-25T00:31:39.000Z | 2020-04-25T00:31:39.000Z | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from unittest import TestCase, mock
from hvac.exceptions import InvalidPath, VaultError
from airflow.providers.hashicorp.secrets.vault import VaultBackend

class TestVaultSecrets(TestCase):
    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_conn_uri(self, mock_hvac):
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        mock_client.secrets.kv.v2.read_secret_version.return_value = {
            'request_id': '94011e25-f8dc-ec29-221b-1f9c1d9ad2ae',
            'lease_id': '',
            'renewable': False,
            'lease_duration': 0,
            'data': {
                'data': {'conn_uri': 'postgresql://airflow:airflow@host:5432/airflow'},
                'metadata': {'created_time': '2020-03-16T21:01:43.331126Z',
                             'deletion_time': '',
                             'destroyed': False,
                             'version': 1}},
            'wrap_info': None,
            'warnings': None,
            'auth': None
        }

        kwargs = {
            "connections_path": "connections",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS"
        }

        test_client = VaultBackend(**kwargs)
        returned_uri = test_client.get_conn_uri(conn_id="test_postgres")
        self.assertEqual('postgresql://airflow:airflow@host:5432/airflow', returned_uri)

    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_conn_uri_engine_version_1(self, mock_hvac):
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        mock_client.secrets.kv.v1.read_secret.return_value = {
            'request_id': '182d0673-618c-9889-4cba-4e1f4cfe4b4b',
            'lease_id': '',
            'renewable': False,
            'lease_duration': 2764800,
            'data': {'conn_uri': 'postgresql://airflow:airflow@host:5432/airflow'},
            'wrap_info': None,
            'warnings': None,
            'auth': None}

        kwargs = {
            "connections_path": "connections",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS",
            "kv_engine_version": 1
        }

        test_client = VaultBackend(**kwargs)
        returned_uri = test_client.get_conn_uri(conn_id="test_postgres")
        mock_client.secrets.kv.v1.read_secret.assert_called_once_with(
            mount_point='airflow', path='connections/test_postgres')
        self.assertEqual('postgresql://airflow:airflow@host:5432/airflow', returned_uri)

    @mock.patch.dict('os.environ', {
        'AIRFLOW_CONN_TEST_MYSQL': 'mysql://airflow:airflow@host:5432/airflow',
    })
    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_conn_uri_non_existent_key(self, mock_hvac):
        """
        Test that if the key with the connection ID is not present in Vault,
        VaultBackend.get_conn_uri should return None
        """
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        # Response does not contain the requested key
        mock_client.secrets.kv.v2.read_secret_version.side_effect = InvalidPath()

        kwargs = {
            "connections_path": "connections",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS"
        }

        test_client = VaultBackend(**kwargs)
        self.assertIsNone(test_client.get_conn_uri(conn_id="test_mysql"))
        mock_client.secrets.kv.v2.read_secret_version.assert_called_once_with(
            mount_point='airflow', path='connections/test_mysql')
        self.assertEqual([], test_client.get_connections(conn_id="test_mysql"))

    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_variable_value(self, mock_hvac):
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        mock_client.secrets.kv.v2.read_secret_version.return_value = {
            'request_id': '2d48a2ad-6bcb-e5b6-429d-da35fdf31f56',
            'lease_id': '',
            'renewable': False,
            'lease_duration': 0,
            'data': {'data': {'value': 'world'},
                     'metadata': {'created_time': '2020-03-28T02:10:54.301784Z',
                                  'deletion_time': '',
                                  'destroyed': False,
                                  'version': 1}},
            'wrap_info': None,
            'warnings': None,
            'auth': None
        }

        kwargs = {
            "variables_path": "variables",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS"
        }

        test_client = VaultBackend(**kwargs)
        returned_uri = test_client.get_variable("hello")
        self.assertEqual('world', returned_uri)

    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_variable_value_engine_version_1(self, mock_hvac):
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        mock_client.secrets.kv.v1.read_secret.return_value = {
            'request_id': '182d0673-618c-9889-4cba-4e1f4cfe4b4b',
            'lease_id': '',
            'renewable': False,
            'lease_duration': 2764800,
            'data': {'value': 'world'},
            'wrap_info': None,
            'warnings': None,
            'auth': None}

        kwargs = {
            "variables_path": "variables",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS",
            "kv_engine_version": 1
        }

        test_client = VaultBackend(**kwargs)
        returned_uri = test_client.get_variable("hello")
        mock_client.secrets.kv.v1.read_secret.assert_called_once_with(
            mount_point='airflow', path='variables/hello')
        self.assertEqual('world', returned_uri)

    @mock.patch.dict('os.environ', {
        'AIRFLOW_VAR_HELLO': 'world',
    })
    @mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
    def test_get_variable_value_non_existent_key(self, mock_hvac):
        """
        Test that if the key with the variable name is not present in Vault,
        VaultBackend.get_variable should return None
        """
        mock_client = mock.MagicMock()
        mock_hvac.Client.return_value = mock_client
        # Response does not contain the requested key
        mock_client.secrets.kv.v2.read_secret_version.side_effect = InvalidPath()

        kwargs = {
            "variables_path": "variables",
            "mount_point": "airflow",
            "auth_type": "token",
            "url": "http://127.0.0.1:8200",
            "token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS"
        }

        test_client = VaultBackend(**kwargs)
        self.assertIsNone(test_client.get_variable("hello"))
        mock_client.secrets.kv.v2.read_secret_version.assert_called_once_with(
            mount_point='airflow', path='variables/hello')
"url": "http://127.0.0.1:8200",
"token": "s.7AU0I51yv1Q1lxOIg1F3ZRAS"
}
test_client = VaultBackend(**kwargs)
self.assertIsNone(test_client.get_variable("hello"))
mock_client.secrets.kv.v2.read_secret_version.assert_called_once_with(
mount_point='airflow', path='variables/hello')
self.assertIsNone(test_client.get_variable("hello"))
@mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
def test_auth_failure_raises_error(self, mock_hvac):
mock_client = mock.MagicMock()
mock_hvac.Client.return_value = mock_client
mock_client.is_authenticated.return_value = False
kwargs = {
"connections_path": "connections",
"mount_point": "airflow",
"auth_type": "token",
"url": "http://127.0.0.1:8200",
"token": "test_wrong_token"
}
with self.assertRaisesRegex(VaultError, "Vault Authentication Error!"):
VaultBackend(**kwargs).get_connections(conn_id='test')
@mock.patch("airflow.providers.hashicorp.secrets.vault.hvac")
def test_empty_token_raises_error(self, mock_hvac):
mock_client = mock.MagicMock()
mock_hvac.Client.return_value = mock_client
kwargs = {
"connections_path": "connections",
"mount_point": "airflow",
"auth_type": "token",
"url": "http://127.0.0.1:8200",
}
with self.assertRaisesRegex(VaultError, "token cannot be None for auth_type='token'"):
VaultBackend(**kwargs).get_connections(conn_id='test')
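
# The two response shapes mocked above differ only in nesting: a KV v1
# read_secret returns the secret under response["data"], while a KV v2
# read_secret_version wraps it one level deeper under response["data"]["data"]
# (next to "metadata"). A minimal standalone sketch of that distinction
# (illustrative helper, not part of the Airflow provider):
def _extract_secret_payload(response, kv_engine_version):
    """Return the secret dict from an hvac read response for KV v1 or v2."""
    if kv_engine_version == 1:
        return response["data"]
    return response["data"]["data"]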

# ---- scipy/linalg/benchmarks/bench_basic.py (lesserwhirls/scipy-cwt, BSD-3-Clause) ----
import sys
from numpy.testing import *
import numpy.linalg as linalg


def random(size):
    return rand(*size)


class TestSolve(TestCase):

    def bench_random(self):
        basic_solve = linalg.solve
        print
        print '      Solving system of linear equations'
        print '      =================================='
        print '      |    contiguous     |   non-contiguous '
        print '----------------------------------------------'
        print ' size |  scipy  | basic   |  scipy  | basic '

        for size, repeat in [(20, 1000), (100, 150), (500, 2), (1000, 1)][:-1]:
            repeat *= 2
            print '%5s' % size,
            sys.stdout.flush()

            a = random([size, size])
            # larger diagonal ensures non-singularity:
            for i in range(size):
                a[i, i] = 10 * (.1 + a[i, i])
            b = random([size])

            print '| %6.2f ' % measure('solve(a,b)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_solve(a,b)', repeat),
            sys.stdout.flush()

            a = a[-1::-1, -1::-1]  # turn into a non-contiguous array
            assert not a.flags['CONTIGUOUS']

            print '| %6.2f ' % measure('solve(a,b)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_solve(a,b)', repeat),
            sys.stdout.flush()

            print '   (secs for %s calls)' % (repeat)


class TestInv(TestCase):

    def bench_random(self):
        basic_inv = linalg.inv
        print
        print '      Finding matrix inverse'
        print '      =================================='
        print '      |    contiguous     |   non-contiguous '
        print '----------------------------------------------'
        print ' size |  scipy  | basic   |  scipy  | basic'

        for size, repeat in [(20, 1000), (100, 150), (500, 2), (1000, 1)][:-1]:
            repeat *= 2
            print '%5s' % size,
            sys.stdout.flush()

            a = random([size, size])
            # large diagonal ensures non-singularity:
            for i in range(size):
                a[i, i] = 10 * (.1 + a[i, i])

            print '| %6.2f ' % measure('inv(a)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_inv(a)', repeat),
            sys.stdout.flush()

            a = a[-1::-1, -1::-1]  # turn into a non-contiguous array
            assert not a.flags['CONTIGUOUS']

            print '| %6.2f ' % measure('inv(a)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_inv(a)', repeat),
            sys.stdout.flush()

            print '   (secs for %s calls)' % (repeat)


class TestDet(TestCase):

    def bench_random(self):
        basic_det = linalg.det
        print
        print '      Finding matrix determinant'
        print '      =================================='
        print '      |    contiguous     |   non-contiguous '
        print '----------------------------------------------'
        print ' size |  scipy  | basic   |  scipy  | basic '

        for size, repeat in [(20, 1000), (100, 150), (500, 2), (1000, 1)][:-1]:
            repeat *= 2
            print '%5s' % size,
            sys.stdout.flush()

            a = random([size, size])

            print '| %6.2f ' % measure('det(a)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_det(a)', repeat),
            sys.stdout.flush()

            a = a[-1::-1, -1::-1]  # turn into a non-contiguous array
            assert not a.flags['CONTIGUOUS']

            print '| %6.2f ' % measure('det(a)', repeat),
            sys.stdout.flush()

            print '| %6.2f ' % measure('basic_det(a)', repeat),
            sys.stdout.flush()

            print '   (secs for %s calls)' % (repeat)


if __name__ == "__main__":
    run_module_suite()
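
# Portable timing sketch (assumption: plain NumPy + the stdlib; ``measure`` and
# ``run_module_suite`` above come from a long-removed numpy.testing API). The
# contiguous solve benchmark step could be expressed with timeit instead:
import timeit

import numpy as np


def bench_solve(size=100, repeat=50):
    """Seconds spent on `repeat` dense solves of a size x size system."""
    a = np.random.rand(size, size)
    a[np.diag_indices(size)] += 10.0  # larger diagonal ensures non-singularity
    b = np.random.rand(size)
    return timeit.timeit(lambda: np.linalg.solve(a, b), number=repeat)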

# ---- teaser/data/input/lca_data_input.py (linuscuy/TEASER, MIT) ----
# -*- coding: utf-8 -*-
"""
Created on Wed Oct 13 17:25:24 2021
@author: Linus
"""
from teaser.logic.buildingobjects.buildingphysics.en15804indicatorvalue import En15804IndicatorValue

_EN15804_INDICATORS = (
    "pere", "pert", "penre", "penrm", "penrt", "sm", "rsf", "nrsf", "fw",
    "hwd", "nhwd", "rwd", "cru", "mfr", "mer", "eee", "eet", "gwp", "odp",
    "pocp", "ap", "ep", "adpe", "adpf",
)


def _set_indicator_values(lca_data, data):
    """Copy every EN 15804 indicator block from a JSON record onto lca_data."""
    for name in _EN15804_INDICATORS:
        indicator = En15804IndicatorValue()
        indicator.set_values(**data[name])
        setattr(lca_data, name, indicator)


def load_en15804_lca_data_id(lca_data, lca_id, data_class):
    """LCA-data loader with id as identification.

    Loads the LCA-data specified in the JSON by the given LCA-ID.

    Parameters
    ----------
    lca_data : En15804MainLcaData()
        instance of TEASER's En15804MainLcaData class
    lca_id : str
        id of LCA-data from JSON
    data_class : DataClass()
        DataClass containing the bindings for En15804MainLcaData,
        TypeBuildingElement and Material (typically this is the data class
        stored in prj.data, but the user can individually change that).
    """
    binding = data_class.lca_data_bind

    for key, data in binding.items():
        if key != "version" and key == lca_id:
            lca_data.lca_data_id = key
            lca_data.name = data["name"]
            lca_data.ref_flow_value = data["ref_flow"]["value"]
            lca_data.ref_flow_unit = data["ref_flow"]["unit"]

            _set_indicator_values(lca_data, data)

            if data["fallback"]:
                lca_data.load_fallbacks(data["fallback"], data_class)
                lca_data.add_fallbacks()
            else:
                lca_data.fallback = []


def load_en15804_lca_data_fallback_id(lca_data, lca_id, data_class):
    """LCA-data-fallback loader with id as identification.

    Loads the LCA-data fallbacks specified in the JSON by the given LCA-ID.
    LCA fallbacks are kept in a separate JSON file to clarify that they are
    only partially defined.

    Parameters
    ----------
    lca_data : En15804MainLcaData()
        instance of TEASER's En15804MainLcaData class
    lca_id : str
        id of LCA-data from JSON
    data_class : DataClass()
        DataClass containing the bindings for En15804MainLcaData,
        TypeBuildingElement and Material (typically this is the data class
        stored in prj.data, but the user can individually change that).
    """
    binding = data_class.lca_data_fallback_bind

    for key, data in binding.items():
        if key != "version" and key == lca_id:
            lca_data.lca_data_id = key
            lca_data.name = data["name"]
            lca_data.ref_flow_value = data["ref_flow"]["value"]
            lca_data.ref_flow_unit = data["ref_flow"]["unit"]

            _set_indicator_values(lca_data, data)

            lca_data.fallback = None
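
# Standalone illustration (no TEASER imports) of the attribute-copy pattern the
# loaders above rely on: each indicator name found in the JSON record becomes
# an attribute on the target object. ``_IndicatorBag`` is a hypothetical
# stand-in for En15804MainLcaData in this sketch.
class _IndicatorBag(object):
    """Hypothetical stand-in for En15804MainLcaData."""


def _copy_fields(target, record, names):
    # setattr(obj, "gwp", v) is equivalent to obj.gwp = v, but lets the
    # attribute name come from data instead of being spelled out 24 times.
    for name in names:
        setattr(target, name, record[name])
    return target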

# ---- scripts/vault/bypass_constructor.py (yogeshprabhu/Ansible-inventory-file-examples, MIT) ----
#!/usr/bin/env python
# -*- coding: utf-8 -*-
print """_meta:
  hostvars:
    foobar:
      should_be_artemis_here: !vault |
          $ANSIBLE_VAULT;1.2;AES256;alan
          30386264646430643536336230313232653130643332356531633437363837323430663031356364
          3836313935643038306263613631396136663634613066650a303838613532313236663966343433
          37636234366130393131616631663831383237653761373533363666303361333662373664336261
          6136313463383061330a633835643434616562633238383530356632336664316366376139306135
          3534
ungrouped:
  hosts:
  - foobar"""

# ---- numba_stream/grid_test.py (jackd/numba-stream, Apache-2.0) ----
import unittest

import numpy as np

import numba_stream.grid as grid


class GridTest(unittest.TestCase):
    # def test_neighbor_offsets(self):
    #     actual = grid.neighbor_offsets(np.array((3,)))
    #     expected = [[-1], [0], [1]]
    #     np.testing.assert_equal(actual, expected)

    #     actual = grid.neighbor_offsets(np.array((2, 3)))
    #     expected = np.stack(np.meshgrid(np.arange(2),
    #                                     np.arange(3) - 1,
    #                                     indexing='ij'),
    #                         axis=-1).reshape((6, 2))
    #     np.testing.assert_equal(actual, expected)

    #     actual = grid.neighbor_offsets(np.array((2, 2)))
    #     np.testing.assert_equal(actual, [[0, 0], [0, 1], [1, 0], [1, 1]])

    #     actual = grid.neighbor_offsets(np.array((3, 3)))
    #     np.testing.assert_equal(actual, [
    #         [-1, -1],
    #         [-1, 0],
    #         [-1, 1],
    #         [0, -1],
    #         [0, 0],
    #         [0, 1],
    #         [1, -1],
    #         [1, 0],
    #         [1, 1],
    #     ])

    def test_ravel_multi_index(self):
        dims = (5, 6, 7)
        size = 100
        indices = tuple(np.random.randint(0, d, size=size) for d in dims)
        actual = grid.ravel_multi_index(indices, dims)
        expected = np.ravel_multi_index(indices, dims)
        np.testing.assert_equal(actual, expected)

    def test_ravel_multi_index_transpose(self):
        dims = (5, 6, 7)
        size = 100
        indices = tuple(np.random.randint(0, d, size=size) for d in dims)
        actual = grid.ravel_multi_index_transpose(np.stack(indices, axis=-1), dims)
        expected = np.ravel_multi_index(indices, dims)
        np.testing.assert_equal(actual, expected)

    def test_unravel_index(self):
        dims = (5, 6, 7)
        size = 100
        indices = tuple(np.random.randint(0, d, size=size) for d in dims)
        ravelled = np.ravel_multi_index(indices, dims)
        # dims = (2, 3)
        # ravelled = np.array([1, 3, 5, 2])
        actual = grid.unravel_index(ravelled, dims)
        expected = np.stack(np.unravel_index(ravelled, dims), axis=0)
        np.testing.assert_equal(actual, expected)

    def test_unravel_index_transpose(self):
        dims = (5, 6, 7)
        size = 100
        indices = tuple(np.random.randint(0, d, size=size) for d in dims)
        ravelled = np.ravel_multi_index(indices, dims)
        # dims = (2, 3)
        # ravelled = np.array([1, 3, 5, 2])
        actual = grid.unravel_index_transpose(ravelled, dims)
        expected = np.stack(np.unravel_index(ravelled, dims), axis=-1)
        np.testing.assert_equal(actual, expected)

    def test_base_grid_coords(self):
        np.testing.assert_equal(
            grid.base_grid_coords(np.array((3, 4))),
            [
                [0, 0],
                [0, 1],
                [0, 2],
                [0, 3],
                [1, 0],
                [1, 1],
                [1, 2],
                [1, 3],
                [2, 0],
                [2, 1],
                [2, 2],
                [2, 3],
            ],
        )

    def test_grid_coords(self):
        coords, shape = grid.grid_coords(
            in_shape=np.array([5]),
            kernel_shape=np.array([3]),
            strides=np.array([1]),
            padding=np.array([0]),
        )
        np.testing.assert_equal(shape, (3,))
        np.testing.assert_equal(coords, np.expand_dims([0, 1, 2], axis=-1))

    def test_strided_grid_coords(self):
        coords, shape = grid.grid_coords(
            in_shape=np.array([5]),
            kernel_shape=np.array([3]),
            strides=np.array([2]),
            padding=np.array([0]),
        )
        np.testing.assert_equal(shape, (2,))
        np.testing.assert_equal(coords, np.expand_dims([0, 2], axis=-1))

    def test_padded_grid_coords(self):
        coords, shape = grid.grid_coords(
            in_shape=np.array([5]),
            kernel_shape=np.array([3]),
            strides=np.array([1]),
            padding=np.array([1]),
        )
        np.testing.assert_equal(shape, (5,))
        np.testing.assert_equal(coords, np.expand_dims([-1, 0, 1, 2, 3], axis=-1))

    def test_padded_strided_grid_coords(self):
        coords, shape = grid.grid_coords(
            in_shape=np.array([5]),
            kernel_shape=np.array([3]),
            strides=np.array([2]),
            padding=np.array([1]),
        )
        np.testing.assert_equal(shape, (3,))
        np.testing.assert_equal(coords, np.expand_dims([-1, 1, 3], axis=-1))

    def test_even_grid_coords(self):
        coords, shape = grid.grid_coords(
            in_shape=np.array([4]),
            kernel_shape=np.array([3]),
            strides=np.array([2]),
            padding=np.array([1]),
        )
        np.testing.assert_equal(shape, (2,))
        np.testing.assert_equal(coords, np.expand_dims([-1, 1], axis=-1))

    def test_sparse_neighborhood_1d(self):
        in_shape = np.array((7,))
        kernel_shape = np.array((3,))
        strides = np.array((2,))
        padding = np.array((0,))
        p, indices, splits, out_shape = grid.sparse_neighborhood(
            in_shape, kernel_shape, strides, padding=padding
        )
        np.testing.assert_equal(out_shape, (3,))
        np.testing.assert_equal(p, tuple(range(3)) * 3)
        np.testing.assert_equal(indices, [0, 1, 2, 2, 3, 4, 4, 5, 6])
        np.testing.assert_equal(splits, [0, 3, 6, 9])

        in_shape = np.array((7,))
        kernel_shape = np.array((2,))
        strides = np.array((2,))
        padding = np.array((0,))
        p, indices, splits, out_shape = grid.sparse_neighborhood(
            in_shape, kernel_shape, strides, padding=padding
        )
        np.testing.assert_equal(out_shape, (3,))
        np.testing.assert_equal(p, tuple(range(2)) * 3)
        np.testing.assert_equal(indices, [0, 1, 2, 3, 4, 5])
        np.testing.assert_equal(splits, [0, 2, 4, 6])

    def test_sparse_neighborhood(self):
        in_shape = np.array((4, 5), dtype=np.int64)
        kernel_shape = np.array((3, 3), dtype=np.int64)
        strides = np.array((2, 2), dtype=np.int64)
        padding = np.array((0, 0), dtype=np.int64)
        p, indices, splits, out_shape = grid.sparse_neighborhood(
            in_shape, kernel_shape, strides, padding=padding
        )
        np.testing.assert_equal(out_shape, (1, 2))
        np.testing.assert_equal(p, tuple(range(9)) * 2)
        np.testing.assert_equal(
            indices, [0, 1, 2, 5, 6, 7, 10, 11, 12, 2, 3, 4, 7, 8, 9, 12, 13, 14]
        )
        np.testing.assert_equal(splits, [0, 9, 18])

    def test_sparse_neighborhood_padded(self):
        in_shape = np.array((4, 5), dtype=np.int64)
        kernel_shape = np.array((3, 3), dtype=np.int64)
        strides = np.array((2, 2), dtype=np.int64)
        padding = np.array((1, 1), dtype=np.int64)
        p, indices, splits, out_shape = grid.sparse_neighborhood(
            in_shape, kernel_shape, strides, padding=padding
        )
        np.testing.assert_equal(out_shape, (2, 3))
        np.testing.assert_equal(
            p,
            (4, 5, 7, 8, 3, 4, 5, 6, 7, 8, 3, 4, 6, 7, 1, 2, 4, 5, 7, 8)
            + tuple(range(9))
            + (0, 1, 3, 4, 6, 7),
        )
        np.testing.assert_equal(
            indices,
            [
                0, 1, 5, 6,                        # output (0, 0)
                1, 2, 3, 6, 7, 8,                  # output (0, 1)
                3, 4, 8, 9,                        # output (0, 2)
                5, 6, 10, 11, 15, 16,              # output (1, 0)
                6, 7, 8, 11, 12, 13, 16, 17, 18,   # output (1, 1)
                8, 9, 13, 14, 18, 19,              # output (1, 2)
            ],
        )
        np.testing.assert_equal(splits, [0, 4, 10, 14, 20, 29, 35])


if __name__ == "__main__":
    unittest.main()
    # GridTest().test_grid_coords()
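
# Reference sketch of the row-major flattening checked above (pure NumPy,
# independent of numba_stream): flat = ((i0 * d1 + i1) * d2 + i2) ..., i.e.
# a mixed-radix encoding of the per-axis indices.
import numpy as np


def ravel_multi_index_reference(indices, dims):
    """Mixed-radix encoding of per-axis index arrays into flat indices."""
    out = np.zeros_like(indices[0])
    for idx, dim in zip(indices, dims):
        out = out * dim + idx
    return out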

# ---- src/openprocurement/tender/limited/tests/cancellation_blanks.py (pontostroy/api, Apache-2.0) ----
# -*- coding: utf-8 -*-
from mock import patch
from datetime import timedelta
from openprocurement.tender.core.utils import get_now
from openprocurement.api.constants import RELEASE_2020_04_19
from openprocurement.tender.core.tests.cancellation import (
activate_cancellation_after_2020_04_19,
)
from openprocurement.tender.belowthreshold.tests.base import test_organization, test_cancellation
# TenderCancellationResourceTest
def create_tender_cancellation_invalid(self):
response = self.app.post_json(
"/tenders/some_id/cancellations", {"data": test_cancellation}, status=404
)
self.assertEqual(response.status, "404 Not Found")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Not Found", u"location": u"url", u"name": u"tender_id"}]
)
request_path = "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token)
response = self.app.post(request_path, "data", status=415)
self.assertEqual(response.status, "415 Unsupported Media Type")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[
{
u"description": u"Content-Type header should be one of ['application/json']",
u"location": u"header",
u"name": u"Content-Type",
}
],
)
response = self.app.post(request_path, "data", content_type="application/json", status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": u"No JSON object could be decoded", u"location": u"body", u"name": u"data"}],
)
response = self.app.post_json(request_path, "data", status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Data not available", u"location": u"body", u"name": u"data"}]
)
response = self.app.post_json(request_path, {"not_data": {}}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Data not available", u"location": u"body", u"name": u"data"}]
)
response = self.app.post_json(request_path, {"data": {}}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"],
[{u"description": [u"This field is required."], u"location": u"body", u"name": u"reason"}],
)
response = self.app.post_json(request_path, {"data": {"invalid_field": "invalid_value"}}, status=422)
self.assertEqual(response.status, "422 Unprocessable Entity")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["status"], "error")
self.assertEqual(
response.json["errors"], [{u"description": u"Rogue field", u"location": u"body", u"name": u"invalid_field"}]
)
def create_tender_cancellation(self):
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": test_cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": test_cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
first_cancellation = response.json["data"]
self.assertEqual(first_cancellation["reason"], "cancellation reason")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": test_cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
second_cancellation = response.json["data"]
self.assertEqual(second_cancellation["reason"], "cancellation reason")
if get_now() < RELEASE_2020_04_19:
response = self.app.patch_json(
"/tenders/{}/cancellations/{}?acc_token={}".format(
self.tender_id, second_cancellation["id"], self.tender_token
),
{"data": {"status": "active"}},
)
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active")
else:
activate_cancellation_after_2020_04_19(self, second_cancellation["id"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "cancelled")
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": test_cancellation},
status=403,
)
self.assertEqual(response.status, "403 Forbidden")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(
response.json["errors"][0]["description"], "Can't update tender in current (cancelled) status"
)
def create_tender_cancellation_with_post(self):
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": test_cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "active")
cancellation = dict(**test_cancellation)
cancellation.update({
"status": "active"
})
response = self.app.post_json(
"/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
{"data": cancellation},
)
self.assertEqual(response.status, "201 Created")
self.assertEqual(response.content_type, "application/json")
cancellation = response.json["data"]
self.assertEqual(cancellation["reason"], "cancellation reason")
if get_now() < RELEASE_2020_04_19:
self.assertEqual(cancellation["status"], "active")
self.assertIn("id", cancellation)
self.assertIn(cancellation["id"], response.headers["Location"])
else:
self.assertEqual(cancellation["status"], "draft")
activate_cancellation_after_2020_04_19(self, cancellation["id"])
response = self.app.get("/tenders/{}".format(self.tender_id))
self.assertEqual(response.status, "200 OK")
self.assertEqual(response.content_type, "application/json")
self.assertEqual(response.json["data"]["status"], "cancelled")
    def create_cancellation_on_lot(self):
        """ Try to create a cancellation with cancellationOf = lot while the tender has no lots """
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
            "relatedLot": "1" * 32
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"],
            [
                {u"location": u"body", u"name": u"relatedLot", u"description": [u"relatedLot should be one of lots"]},
                {
                    u"location": u"body",
                    u"name": u"cancellationOf",
                    u"description": [
                        u'Lot cancellation can not be submitted, since "multiple lots" option is not available for this type of tender.'
                    ],
                },
            ],
        )
    # TenderNegotiationCancellationResourceTest
    def negotiation_create_cancellation_on_lot(self):
        """ Try to create a cancellation with cancellationOf = lot while the tender has no lots """
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
            "relatedLot": "1" * 32
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"],
            [{u"description": [u"relatedLot should be one of lots"], u"location": u"body", u"name": u"relatedLot"}],
        )
    # TenderNegotiationLotsCancellationResourceTest
    def create_tender_lots_cancellation(self):
        lot_id = self.initial_lots[0]["id"]
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
            "relatedLot": lot_id
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn(cancellation["id"], response.headers["Location"])

        response = self.app.get("/tenders/{}".format(self.tender_id))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["lots"][0]["status"], "active")
        self.assertEqual(response.json["data"]["status"], "active")

        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
            "relatedLot": lot_id,
            "status": "active"
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn(cancellation["id"], response.headers["Location"])
        if RELEASE_2020_04_19 > get_now():
            self.assertEqual(cancellation["status"], "active")
        else:
            activate_cancellation_after_2020_04_19(self, cancellation["id"])

        response = self.app.get("/tenders/{}".format(self.tender_id))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["lots"][0]["status"], "cancelled")
        self.assertNotEqual(response.json["data"]["status"], "cancelled")

        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
            "relatedLot": lot_id,
            "status": "active"
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=403,
        )
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["errors"][0]["description"], "Can perform cancellation only in active lot status")

        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "lot",
            "relatedLot": self.initial_lots[1]["id"],
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn(cancellation["id"], response.headers["Location"])
        if RELEASE_2020_04_19 > get_now():
            self.assertEqual(cancellation["status"], "active")
        else:
            activate_cancellation_after_2020_04_19(self, cancellation["id"])

        response = self.app.get("/tenders/{}".format(self.tender_id))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["lots"][0]["status"], "cancelled")
        self.assertEqual(response.json["data"]["lots"][1]["status"], "cancelled")
        self.assertEqual(response.json["data"]["status"], "cancelled")
    def cancelled_lot_without_relatedLot(self):
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "cancellationOf": "lot",
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=422,
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"],
            [{"location": "body", "name": "relatedLot", "description": ["This field is required."]}],
        )
    def delete_first_lot_second_cancel(self):
        """ Delete one lot, cancel the other, and check the tender status """
        self.app.patch_json(
            "/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": {"items": [{"relatedLot": self.initial_lots[1]["id"]}]}},
        )
        response = self.app.delete(
            "/tenders/{}/lots/{}?acc_token={}".format(self.tender_id, self.initial_lots[0]["id"], self.tender_token)
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")

        response = self.app.get("/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(len(response.json["data"]), 1)

        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "lot",
            "relatedLot": self.initial_lots[1]["id"],
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn(cancellation["id"], response.headers["Location"])
        if RELEASE_2020_04_19 > get_now():
            self.assertEqual(cancellation["status"], "active")
        else:
            activate_cancellation_after_2020_04_19(self, cancellation["id"])

        response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["status"], "cancelled")
    def cancel_tender(self):
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "tender",
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        if get_now() < RELEASE_2020_04_19:
            self.assertEqual(cancellation["status"], "active")
            self.assertIn("id", cancellation)
            self.assertIn(cancellation["id"], response.headers["Location"])
        else:
            activate_cancellation_after_2020_04_19(self, cancellation["id"])

        # Check tender
        response = self.app.get("/tenders/{}?acc_token={}".format(self.tender_id, self.tender_token))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["status"], "cancelled")

        # Check lots
        response = self.app.get("/tenders/{}/lots?acc_token={}".format(self.tender_id, self.tender_token))
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"][0]["status"], "active")
        self.assertEqual(response.json["data"][1]["status"], "active")
    def create_cancellation_on_tender_with_one_complete_lot(self):
        lot = self.initial_lots[0]

        # Create award
        response = self.app.post_json(
            "/tenders/{}/awards?acc_token={}".format(self.tender_id, self.tender_token),
            {
                "data": {
                    "suppliers": [test_organization],
                    "status": "pending",
                    "qualified": True,
                    "value": {"amount": 469, "currency": "UAH", "valueAddedTaxIncluded": True},
                    "lotID": lot["id"],
                }
            },
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.json["data"]["status"], "pending")

        # Activate award
        award = response.json["data"]
        response = self.app.patch_json(
            "/tenders/{}/awards/{}?acc_token={}".format(self.tender_id, award["id"], self.tender_token),
            {"data": {"status": "active"}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.json["data"]["status"], "active")

        # Time travel: close the award complaint period
        tender = self.db.get(self.tender_id)
        for i in tender.get("awards", []):
            if i.get("complaintPeriod", {}):  # reporting procedure does not have complaintPeriod
                i["complaintPeriod"]["endDate"] = i["complaintPeriod"]["startDate"]
        self.db.save(tender)

        # Sign contract
        response = self.app.get("/tenders/{}/contracts".format(self.tender_id))
        response = self.app.patch_json(
            "/tenders/{}/contracts/{}?acc_token={}".format(
                self.tender_id, response.json["data"][0]["id"], self.tender_token
            ),
            {"data": {"status": "active", "value": {"valueAddedTaxIncluded": False}}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.json["data"]["status"], "active")

        # Try to create a cancellation on the tender
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "tender",
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=403,
        )
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"][0]["description"], "Can't perform cancellation, if there is at least one complete lot"
        )
    def cancellation_on_not_active_lot(self):
        lot = self.initial_lots[0]

        # Create a cancellation that moves the lot to status cancelled
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "lot",
            "relatedLot": lot["id"],
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation_id = response.json["data"]["id"]
        if RELEASE_2020_04_19 < get_now():
            activate_cancellation_after_2020_04_19(self, cancellation_id)

        # Check lot status
        response = self.app.get("/tenders/{}/lots/{}".format(self.tender_id, lot["id"]))
        self.assertEqual(response.json["data"]["status"], "cancelled")

        # Try to create a cancellation on the lot with status cancelled
        cancellation = dict(**test_cancellation)
        cancellation.update({
            "status": "active",
            "cancellationOf": "lot",
            "relatedLot": lot["id"],
        })
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation},
            status=403,
        )
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["errors"][0]["description"], "Can perform cancellation only in active lot status")
    @patch("openprocurement.tender.core.models.RELEASE_2020_04_19", get_now() - timedelta(days=1))
    @patch("openprocurement.tender.core.views.cancellation.RELEASE_2020_04_19", get_now() - timedelta(days=1))
    def create_tender_cancellation_2020_04_19(self):
        reasonType_choices = self.valid_reasonType_choices

        request_path = "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token)
        cancellation = dict(**test_cancellation)
        cancellation.update({"reasonType": reasonType_choices[0]})
        response = self.app.post_json(
            request_path,
            {"data": cancellation}
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        cancellation_id = cancellation["id"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn("date", cancellation)
        self.assertEqual(cancellation["reasonType"], reasonType_choices[0])
        self.assertEqual(cancellation["status"], "draft")
        self.assertIn(cancellation_id, response.headers["Location"])

        cancellation = dict(**test_cancellation)
        cancellation.update({"reasonType": reasonType_choices[1]})
        response = self.app.post_json(
            request_path,
            {"data": cancellation}
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        cancellation_id = cancellation["id"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn("date", cancellation)
        self.assertEqual(cancellation["reasonType"], reasonType_choices[1])
        self.assertEqual(cancellation["status"], "draft")
        self.assertIn(cancellation_id, response.headers["Location"])

        response = self.app.post(
            "/tenders/{}/cancellations/{}/documents?acc_token={}".format(
                self.tender_id, cancellation_id, self.tender_token
            ),
            upload_files=[("file", "name.doc", "content")],
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")

        request_path = "/tenders/{}/cancellations/{}?acc_token={}".format(
            self.tender_id, cancellation_id, self.tender_token
        )
        response = self.app.patch_json(
            request_path,
            {"data": {"status": "pending"}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["status"], "pending")
    @patch("openprocurement.tender.core.models.RELEASE_2020_04_19", get_now() - timedelta(days=1))
    @patch("openprocurement.tender.core.validation.RELEASE_2020_04_19", get_now() - timedelta(days=1))
    @patch("openprocurement.tender.core.views.cancellation.RELEASE_2020_04_19", get_now() - timedelta(days=1))
    def patch_tender_cancellation_2020_04_19(self):
        reasonType_choices = self.valid_reasonType_choices

        cancellation = dict(**test_cancellation)
        cancellation.update({"reasonType": reasonType_choices[0]})
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation}
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        cancellation_id = cancellation["id"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn("date", cancellation)
        self.assertEqual(cancellation["reasonType"], reasonType_choices[0])
        self.assertEqual(cancellation["status"], "draft")
        self.assertIn(cancellation_id, response.headers["Location"])

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "pending"}},
            status=422
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Fields reason, cancellationOf and documents must be filled for switch cancellation to pending status",
                u"location": u"body",
                u"name": u"data",
            }]
        )

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "active"}},
            status=422
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Cancellation can't be updated from draft to active status",
                u"location": u"body",
                u"name": u"data",
            }]
        )

        response = self.app.post(
            "/tenders/{}/cancellations/{}/documents?acc_token={}".format(
                self.tender_id, cancellation_id, self.tender_token
            ),
            upload_files=[("file", "name.doc", "content")],
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")

        request_path = "/tenders/{}/cancellations/{}?acc_token={}".format(
            self.tender_id, cancellation_id, self.tender_token
        )
        response = self.app.patch_json(
            request_path,
            {"data": {"status": "pending"}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["status"], "pending")

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "draft"}},
            status=422
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Cancellation can't be updated from pending to draft status",
                u"location": u"body",
                u"name": u"data",
            }]
        )
        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "active"}},
            status=422
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Cancellation can't be updated from pending to active status",
                u"location": u"body",
                u"name": u"data",
            }]
        )

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "unsuccessful"}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["status"], "unsuccessful")

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": None}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        # self.assertEqual(response.json["data"]["status"], "unsuccessful")

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "pending"}},
            status=422
        )
        self.assertEqual(response.status, "422 Unprocessable Entity")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Cancellation can't be updated from unsuccessful to pending status",
                u"location": u"body",
                u"name": u"data",
            }]
        )

        cancellation = dict(**test_cancellation)
        cancellation.update({"reasonType": reasonType_choices[1]})
        response = self.app.post_json(
            "/tenders/{}/cancellations?acc_token={}".format(self.tender_id, self.tender_token),
            {"data": cancellation}
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        cancellation_id = cancellation["id"]
        self.assertEqual(cancellation["reason"], "cancellation reason")
        self.assertIn("id", cancellation)
        self.assertIn("date", cancellation)
        self.assertEqual(cancellation["reasonType"], reasonType_choices[1])
        self.assertEqual(cancellation["status"], "draft")
        self.assertIn(cancellation_id, response.headers["Location"])

        response = self.app.post(
            "/tenders/{}/cancellations/{}/documents?acc_token={}".format(
                self.tender_id, cancellation_id, self.tender_token
            ),
            upload_files=[("file", "name.doc", "content")],
        )
        self.assertEqual(response.status, "201 Created")
        self.assertEqual(response.content_type, "application/json")

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "pending"}},
        )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        cancellation = response.json["data"]
        self.assertEqual(cancellation["status"], "pending")

        with patch(
                "openprocurement.tender.core.validation.get_now",
                return_value=get_now() + timedelta(days=20)):
            response = self.app.patch_json(
                "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
                {"data": {"status": "active"}},
            )
        self.assertEqual(response.status, "200 OK")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(response.json["data"]["status"], "active")

        response = self.app.patch_json(
            "/tenders/{}/cancellations/{}?acc_token={}".format(self.tender_id, cancellation_id, self.tender_token),
            {"data": {"status": "pending"}},
            status=403
        )
        self.assertEqual(response.status, "403 Forbidden")
        self.assertEqual(response.content_type, "application/json")
        self.assertEqual(
            response.json["errors"], [{
                u"description": u"Can't update tender in current (cancelled) status",
                u"location": u"body",
                u"name": u"data",
            }]
        )
import sys
#FIXME: use PYTHONPATH with module directory
sys.path.append('../pytamil')
from pytamil import தமிழ்
from தமிழ் import இலக்கணம் as இல
from தமிழ் import புணர்ச்சி
from தமிழ் import எழுத்து
from தமிழ் import மாத்திரை
# print( எழுத்து.எழுத்துக்கள்['மெல்லினம்'])
# print( எழுத்து.எழுத்துக்கள்['குறில்'] )
# print (தமிழ்.வட்டெழுத்து('வணக்கம்'))
# print ( எழுத்து.தொடர்மொழி_ஆக்கு('விருந்து', 'ஓம்பல்' ))
# print( இல.தொடர்மொழி_ஆக்கு('விருந்து', 'ஓம்பல்'))
# print( இல.தொடர்மொழி_ஆக்கு('மெய்', 'எழுத்து'))
# print( இல.தொடர்மொழி_ஆக்கு('மெய்', 'பழுத்து'))
# print( இல.தொடர்மொழி_ஆக்கு('முள்', 'இலை'))
# print( இல.தொடர்மொழி_ஆக்கு('உயிர்', 'எழுத்து'))
# print( இல.தொடர்மொழி_ஆக்கு('வேல்', 'எறிந்தான்'))
# விதிகள் =[]
# விதிகள் = getவிதிகள்(entries,விதிகள்)
# சான்றுகள் = []
# சான்றுகள் = getசான்றுகள்(entries, சான்றுகள்)
# print(விதிகள்)
# print(சான்றுகள்)
# result = புணர்ச்சி.check("(...)(இ,ஈ,ஐ)" ,"மணிதன்")
# print (result)
# result = புணர்ச்சி.check("(...)(உயிர்)" , "மணி")
# print (result)
# result = புணர்ச்சி.check("(உயிர்)(...)" , "அடி")
# print (result)
# print(புணர்ச்சி.தொடர்மொழி_ஆக்கு( 'உயிர்' , 'எழுத்து'))
# புணர்ச்சி.புணர்ச்சிசெய்('''சே|உடம்படுமெய்(ய்)|சும்மா + சும்மா|திரிதல்(வ்)|அடி ,
# சே|உடம்படுமெய்(வ்) + திரிதல்(வ்)|அடி,
# சே|உடம்படுமெய்(வ்) + திரிதல்(வ்)|அடி ''')
# புணர்ச்சி.புணர்ச்சிசெய்('''சே|உடம்படுமெய்(ய்)|சும்மா + சும்மா|திரிதல்(வ்)|அடி ,
# சே|உடம்படுமெய்(வ்) + திரிதல்(வ்)|அடி''')
# புணர்ச்சி.புணர்ச்சிசெய்('சேய் +இயல்பு+ அவ்')
# புணர்ச்சி.தொடர்மொழி_ஆக்கு('சே', 'அடி' )
# புணர்ச்சி.தொடர்மொழி_ஆக்கு('கண்', 'மங்கியது')
# print(மாத்திரை.மாத்திரை_கொடு('புணர்ச்சிசெய்'))
# print(மாத்திரை.மொத்தமாத்திரை('புணர்ச்சிசெய்'))
# print(புணர்ச்சி.தொடர்மொழி_ஆக்கு( 'மணி' , 'அடித்தான்'))
# print(புணர்ச்சி.தொடர்மொழி_ஆக்கு( 'மெய்', 'எழுத்து'))
print(புணர்ச்சி.தொடர்மொழி_ஆக்கு( 'நிலா', 'ஒளி'))
# pylint: disable=invalid-name,line-too-long
import pytest
import srv_msg
import srv_control
import misc
import references
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_onlyPD_request():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '92')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_does_include('Client', None, 'IA-PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
misc.test_procedure()
srv_msg.client_copy_option('server-id')
srv_msg.client_copy_option('IA_PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
references.references_check('RFC')
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_IA_and_PD_request():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::1')
srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '92')
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_does_include('Client', None, 'IA-PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_does_include('Client', None, 'IA-NA')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::1')
misc.test_procedure()
srv_msg.client_copy_option('IA_NA')
srv_msg.client_copy_option('server-id')
srv_msg.client_copy_option('IA_PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
srv_msg.response_check_include_option('Response', None, '3')
srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
srv_msg.response_check_suboption_content('Response', '5', '3', None, 'addr', '3000::1')
references.references_check('RFC')
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_onlyPD_request_release():
misc.test_setup()
srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
# pool of two prefixes
srv_control.build_and_send_config_files('SSH', 'config-file')
srv_control.start_srv('DHCP', 'started')
misc.test_procedure()
srv_msg.client_does_include('Client', None, 'IA-PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('SOLICIT')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
srv_msg.response_check_include_option('Response', None, '25')
misc.test_procedure()
srv_msg.client_copy_option('server-id')
srv_msg.client_copy_option('IA_PD')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('REQUEST')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
misc.test_procedure()
srv_msg.client_copy_option('IA_PD')
srv_msg.client_copy_option('server-id')
srv_msg.client_does_include('Client', None, 'client-id')
srv_msg.client_send_msg('RELEASE')
misc.pass_criteria()
srv_msg.send_wait_for_message('MUST', None, 'REPLY')
srv_msg.response_check_include_option('Response', None, '25')
srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '0')
# tests MUST NOT include 'NoBinding'...
references.references_check('RFC')
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_onlyPD_multiple_request_release():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    # pool of two prefixes
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    misc.test_procedure()
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('server-id')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '0')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    # if this check fails, the earlier RELEASE did not actually free the prefixes
    references.references_check('RFC')

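The "pool of two prefixes" in the setup above follows from the `config_srv_prefix('2001:db8:1::', '0', '90', '91')` call: a /90 pool carved into /91 delegated prefixes yields exactly two, which is why a third client's SOLICIT must come back with NoPrefixAvail. A quick sketch with the standard `ipaddress` module (the concrete network is taken from the config call; the sketch itself is not part of the test suite):

```python
import ipaddress

# The /90 pool from config_srv_prefix, split into /91 delegated prefixes.
pool = ipaddress.ip_network('2001:db8:1::/90')
delegated = list(pool.subnets(new_prefix=91))

print(len(delegated))           # 2 -> only two clients can get a prefix
print(delegated[1])             # 2001:db8:1::20:0:0/91 (the "2nd prefix")
```

The second prefix, `2001:db8:1::20:0:0/91`, is exactly the value the `assign_saved_iapd` test below asserts on.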
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_IA_and_PD_request_release():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    # pool of two prefixes
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')

    misc.test_procedure()
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_copy_option('server-id')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_include_option('Response', None, '3')
    # the Reply MUST NOT include a 'NoBinding' (3) status code
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_IA_and_PD_multiple_request_release():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    # pool of two prefixes
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')

    misc.test_procedure()
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_copy_option('server-id')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '13')
    # the Reply MUST NOT include a 'NoBinding' (3) status code

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.generate_new('IA')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.generate_new('IA')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '5')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_noprefixavail_release():
    # Assign two prefixes, try a third (fails), release one,
    # then assign once more successfully.
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    # pool of two prefixes
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    # success

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    # both prefixes assigned.

    misc.test_procedure()
    srv_msg.client_save_option('IA_PD')
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '6')

    misc.test_procedure()
    srv_msg.client_add_saved_option('DONT ')
    srv_msg.client_copy_option('server-id')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '0')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_noprefixavail():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    # pool of two prefixes
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    # both prefixes assigned.

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '6')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_release_nobinding():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/32', '3000::1-3000::2')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '32', '33')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '3')
    references.references_check('RFC')

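The numeric status codes asserted throughout these tests are the standard DHCPv6 values from RFC 3315 and RFC 3633. For readability, here is a small reference table (a sketch for the reader, not part of the test harness):

```python
# DHCPv6 Status Code values checked in this suite
# (RFC 3315 section 24.4; NoPrefixAvail from RFC 3633)
STATUS_CODES = {
    0: 'Success',        # e.g. a RELEASE that was accepted
    2: 'NoAddrsAvail',   # no addresses available for an IA_NA
    3: 'NoBinding',      # server has no binding for the released IA
    6: 'NoPrefixAvail',  # delegated-prefix pool exhausted
}

print(STATUS_CODES[3])  # NoBinding
```

So a RELEASE for a lease the server never granted (as in the test above) must come back with status code 3.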
@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_release_dual_nobinding():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/32', '3000::1-3000::2')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '32', '33')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_copy_option('IA_NA')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '3')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '3', None, 'statuscode', '3')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_release_nobinding2():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/32', '3000::1-3000::2')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '32', '33')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_save_option('IA_PD')
    srv_msg.client_add_saved_option('DONT ')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    # the first RELEASE must not contain status code 3 (NoBinding)

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_add_saved_option('DONT ')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('RELEASE')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '25', None, 'statuscode', '3')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_onlyPD_relay():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '92')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')
    srv_msg.client_does_include('RelayAgent', None, 'interface-id')
    srv_msg.create_relay_forward()

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'RELAYREPLY')
    srv_msg.response_check_include_option('Response', None, '18')
    srv_msg.response_check_include_option('Response', None, '9')
    # add test after Scapy fix
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
def test_prefix_delegation_assign_saved_iapd():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    # two prefixes - 2001:db8:1::/91; 2001:db8:1::20:0:0/91
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '91')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    # 1st prefix
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    misc.test_procedure()
    srv_msg.generate_new('IA_PD')
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_save_option('IA_PD')
    srv_msg.client_add_saved_option('DONT ')
    # 2nd prefix
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    # both prefixes assigned.

    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::3')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '80', '95')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_add_saved_option(None)
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_suboption_content('Response',
                                             '26',
                                             '25',
                                             None,
                                             'prefix',
                                             '2001:db8:1::20:0:0')
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
@pytest.mark.rfc3633
@pytest.mark.disabled
def test_prefix_delegation_compare_prefixes_after_client_reboot():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '3000::1-3000::300')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '96')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    # save prefix value
    prefix1 = srv_msg.get_suboption('IA_PD', 'IA-Prefix')[0]

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    # client reboot
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')

    # compare assigned prefix with the saved one
    prefix2 = srv_msg.get_suboption('IA_PD', 'IA-Prefix')[0]
    assert prefix1.prefix == prefix2.prefix
    references.references_check('RFC')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
def test_prefix_delegation_just_PD_configured_PD_requested():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '$(EMPTY)')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '96')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', 'NOT ', '3')

    misc.test_procedure()
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', 'NOT ', '3')

@pytest.mark.v6
@pytest.mark.dhcp6
@pytest.mark.PD
def test_prefix_delegation_just_PD_configured_PD_and_IA_requested():
    misc.test_setup()
    srv_control.config_srv_subnet('3000::/64', '$(EMPTY)')
    srv_control.config_srv_prefix('2001:db8:1::', '0', '90', '96')
    srv_control.build_and_send_config_files('SSH', 'config-file')
    srv_control.start_srv('DHCP', 'started')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_send_msg('SOLICIT')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'ADVERTISE')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '3', None, 'statuscode', '2')

    misc.test_procedure()
    srv_msg.client_does_include('Client', None, 'IA-NA')
    srv_msg.client_copy_option('server-id')
    srv_msg.client_copy_option('IA_PD')
    srv_msg.client_does_include('Client', None, 'client-id')
    srv_msg.client_send_msg('REQUEST')

    misc.pass_criteria()
    srv_msg.send_wait_for_message('MUST', None, 'REPLY')
    srv_msg.response_check_include_option('Response', None, '25')
    srv_msg.response_check_option_content('Response', '25', None, 'sub-option', '26')
    srv_msg.response_check_include_option('Response', None, '3')
    srv_msg.response_check_option_content('Response', '3', None, 'sub-option', '13')
    srv_msg.response_check_suboption_content('Response', '13', '3', None, 'statuscode', '2')

#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##############################################################################
#
# Based on:
# --------------------------------------------------------
# Fast R-CNN
# Copyright (c) 2015 Microsoft
# Licensed under The MIT License [see LICENSE for details]
# Written by Ross Girshick
# --------------------------------------------------------
"""Construct minibatches for Detectron networks."""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals

import cv2
import logging
import random

import numpy as np
import scipy.sparse

from detectron.core.config import cfg
import detectron.roi_data.fast_rcnn as fast_rcnn_roi_data
import detectron.roi_data.retinanet as retinanet_roi_data
import detectron.roi_data.rpn as rpn_roi_data
import detectron.utils.blob as blob_utils
import detectron.utils.boxes as box_utils

WIDTH = 1280
HEIGHT = 960
REAL_CLASS = 4
BRIGHTNESS_CONTRAST = 0

logger = logging.getLogger(__name__)

def get_minibatch_blob_names(is_training=True):
    """Return blob names in the order in which they are read by the data loader.
    """
    # data blob: holds a batch of N images, each with 3 channels
    blob_names = ['data']
    if cfg.RPN.RPN_ON:
        # RPN-only or end-to-end Faster R-CNN
        blob_names += rpn_roi_data.get_rpn_blob_names(is_training=is_training)
    elif cfg.RETINANET.RETINANET_ON:
        blob_names += retinanet_roi_data.get_retinanet_blob_names(
            is_training=is_training
        )
    else:
        # Fast R-CNN like models trained on precomputed proposals
        blob_names += fast_rcnn_roi_data.get_fast_rcnn_blob_names(
            is_training=is_training
        )
    return blob_names

def get_minibatch(roidb):
    """Given a roidb, construct a minibatch sampled from it."""
    # We collect blobs from each image onto a list and then concat them into a
    # single tensor, hence we initialize each blob to an empty list
    blobs = {k: [] for k in get_minibatch_blob_names()}
    # Get the input image blob, formatted for caffe2
    im_blob, im_scales = _get_image_blob(roidb)
    blobs['data'] = im_blob
    if cfg.RPN.RPN_ON:
        # RPN-only or end-to-end Faster/Mask R-CNN
        valid = rpn_roi_data.add_rpn_blobs(blobs, im_scales, roidb)
    elif cfg.RETINANET.RETINANET_ON:
        im_width, im_height = im_blob.shape[3], im_blob.shape[2]
        # im_width, im_height corresponds to the network input: padded image
        # (if needed) width and height. We pass it as input and slice the data
        # accordingly so that we don't need to use SampleAsOp
        valid = retinanet_roi_data.add_retinanet_blobs(
            blobs, im_scales, roidb, im_width, im_height
        )
    else:
        # Fast R-CNN like models trained on precomputed proposals
        valid = fast_rcnn_roi_data.add_fast_rcnn_blobs(blobs, im_scales, roidb)
    return blobs, valid

def get_minibatch_s6(roidb, roidb_noclass):
    """Given a roidb, construct a minibatch sampled from it."""
    # We collect blobs from each image onto a list and then concat them into a
    # single tensor, hence we initialize each blob to an empty list
    if 0:  # disabled experiment: random crop window
        random_bbox = dict()
        random_bbox['kernel_size'] = 224
        random_bbox['tl_x'] = random.randint(0, 800)
        random_bbox['tl_y'] = random.randint(0, 800)
    blobs = {k: [] for k in get_minibatch_blob_names()}
    # Get the input image blob, formatted for caffe2
    im_blob, im_scales, error_flag = _get_image_blob_s6(roidb, roidb_noclass)
    blobs['data'] = im_blob
    if cfg.RPN.RPN_ON:
        # RPN-only or end-to-end Faster/Mask R-CNN
        valid = rpn_roi_data.add_rpn_blobs(blobs, im_scales, roidb)
    elif cfg.RETINANET.RETINANET_ON:
        im_width, im_height = im_blob.shape[3], im_blob.shape[2]
        # im_width, im_height corresponds to the network input: padded image
        # (if needed) width and height. We pass it as input and slice the data
        # accordingly so that we don't need to use SampleAsOp
        valid = retinanet_roi_data.add_retinanet_blobs(
            blobs, im_scales, roidb, im_width, im_height
        )
    else:
        # Fast R-CNN like models trained on precomputed proposals
        valid = fast_rcnn_roi_data.add_fast_rcnn_blobs(blobs, im_scales, roidb)
    return blobs, valid

def contrast_brightness_image(src1, a=1.2, g=10):
    """Return src1 with contrast scaled by `a` and brightness shifted by `g`."""
    h, w, ch = src1.shape
    src2 = np.zeros([h, w, ch], src1.dtype)
    # dst = a * src1 + (1 - a) * src2 + g, saturated to the input dtype
    dst = cv2.addWeighted(src1, a, src2, 1 - a, g)
    # debug visualization only; do not enable during training (waitKey blocks):
    # cv2.imshow("con-bri-demo", dst)
    # cv2.waitKey(0)
    # cv2.destroyAllWindows()
    return dst

def _get_image_blob(roidb):
"""Builds an input blob from the images in the roidb at the specified
scales.
"""
num_images = len(roidb)
# Sample random scales to use for each image in this batch
scale_inds = np.random.randint(
0, high=len(cfg.TRAIN.SCALES), size=num_images
)
processed_ims = []
im_scales = []
for i in range(num_images):
im = cv2.imread(roidb[i]['image'])
if 0:
im_tmp = cv2.imread(roidb[i]['image'])
random_flag = random.randint(0, 1)
if BRIGHTNESS_CONTRAST and random_flag :
im = contrast_brightness_image(im_tmp)
else:
im = im_tmp.copy()
assert im is not None, \
'Failed to read image \'{}\''.format(roidb[i]['image'])
if roidb[i]['flipped']:
im = im[:, ::-1, :]
target_size = cfg.TRAIN.SCALES[scale_inds[i]]
im, im_scale = blob_utils.prep_im_for_blob(
im, cfg.PIXEL_MEANS, target_size, cfg.TRAIN.MAX_SIZE
)
im_scales.append(im_scale)
processed_ims.append(im)
# Create a blob to hold the input images
blob = blob_utils.im_list_to_blob(processed_ims)
return blob, im_scales
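For reference, a minimal standalone sketch of the pad-and-stack step that `blob_utils.im_list_to_blob` performs on the processed images (the real Detectron helper also handles FPN coarsest-stride padding; the function name here is illustrative):

```python
import numpy as np

def ims_to_blob(ims):
    """Pad a list of HxWx3 float images to a common size with zeros and
    stack them into a single NCHW blob."""
    max_h = max(im.shape[0] for im in ims)
    max_w = max(im.shape[1] for im in ims)
    blob = np.zeros((len(ims), max_h, max_w, 3), dtype=np.float32)
    for i, im in enumerate(ims):
        # Each image sits in the top-left corner; the rest stays zero padding
        blob[i, :im.shape[0], :im.shape[1], :] = im
    return blob.transpose(0, 3, 1, 2)  # NHWC -> NCHW for Caffe2

blob = ims_to_blob([np.ones((4, 6, 3)), np.ones((5, 3, 3))])
# blob.shape == (2, 3, 5, 6): batch of 2, 3 channels, padded to 5x6
```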
def mat_inter(box1,box2):
# box=(xA,yA,xB,yB)
x01, y01, x02, y02 = box1
x11, y11, x12, y12 = box2
lx = abs((x01 + x02) / 2 - (x11 + x12) / 2)
ly = abs((y01 + y02) / 2 - (y11 + y12) / 2)
sax = abs(x01 - x02)
sbx = abs(x11 - x12)
say = abs(y01 - y02)
sby = abs(y11 - y12)
if lx <= (sax + sbx) / 2 and ly <= (say + sby) / 2:
return True
else:
return False
def solve_coincide(box1,box2):
# box=(xA,yA,xB,yB)
if mat_inter(box1, box2):
x01, y01, x02, y02 = box1
x11, y11, x12, y12 = box2
col = min(x02, x12) - max(x01, x11)
row = min(y02, y12) - max(y01, y11)
intersection = col * row
area1 = (x02 - x01) * (y02 - y01)
area2 = (x12 - x11) * (y12 - y11)
# Fraction of box2 covered by box1 (force float division; under Python 2
# integer boxes would otherwise truncate to 0). For IoU, divide by
# (area1 + area2 - intersection) instead.
coincide = intersection / float(area2)
return coincide
else:
return False
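The crop-sampling code below keeps a ground-truth box only when `solve_coincide` reports that more than 90% of it falls inside the sampled crop window. A standalone restatement of that fraction, with a quick sanity check on a 224x224 crop:

```python
def box_overlap_fraction(box1, box2):
    """Fraction of box2's area covered by box1; boxes are (x0, y0, x1, y1).
    Returns False when the boxes do not intersect, mirroring solve_coincide."""
    x01, y01, x02, y02 = box1
    x11, y11, x12, y12 = box2
    col = min(x02, x12) - max(x01, x11)
    row = min(y02, y12) - max(y01, y11)
    if col <= 0 or row <= 0:
        return False
    area2 = (x12 - x11) * (y12 - y11)
    return (col * row) / float(area2)

crop = (0, 0, 224, 224)
print(box_overlap_fraction(crop, (10, 10, 50, 50)))      # fully inside -> 1.0
print(box_overlap_fraction(crop, (200, 200, 300, 300)))  # clipped corner -> 0.0576
```

A box fully inside the crop scores 1.0 and is kept; a box hanging off the edge scores below the 0.9 threshold and is dropped.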
def compute_bbox_regression_targets(entry):
"""Compute bounding-box regression targets for an image."""
# Indices of ground-truth ROIs
rois = entry['boxes']
overlaps = entry['max_overlaps']
labels = entry['max_classes']
gt_inds = np.where((entry['gt_classes'] > 0) & (entry['is_crowd'] == 0))[0]
# Targets has format (class, tx, ty, tw, th)
targets = np.zeros((rois.shape[0], 5), dtype=np.float32)
if len(gt_inds) == 0:
# Bail if the image has no ground-truth ROIs
return targets
# Indices of examples for which we try to make predictions
ex_inds = np.where(overlaps >= cfg.TRAIN.BBOX_THRESH)[0]
# Get IoU overlap between each ex ROI and gt ROI
ex_gt_overlaps = box_utils.bbox_overlaps(
rois[ex_inds, :].astype(dtype=np.float32, copy=False),
rois[gt_inds, :].astype(dtype=np.float32, copy=False))
# Find which gt ROI each ex ROI has max overlap with:
# this will be the ex ROI's gt target
gt_assignment = ex_gt_overlaps.argmax(axis=1)
gt_rois = rois[gt_inds[gt_assignment], :]
ex_rois = rois[ex_inds, :]
# Use class "1" for all boxes if using class_agnostic_bbox_reg
targets[ex_inds, 0] = (
1 if cfg.MODEL.CLS_AGNOSTIC_BBOX_REG else labels[ex_inds])
targets[ex_inds, 1:] = box_utils.bbox_transform_inv(
ex_rois, gt_rois, cfg.MODEL.BBOX_REG_WEIGHTS)
return targets
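compute_bbox_regression_targets delegates to `box_utils.bbox_overlaps` for the example-to-ground-truth IoU matrix. A minimal NumPy sketch of that computation (ignoring Detectron's legacy +1-pixel box convention; the function name is illustrative):

```python
import numpy as np

def bbox_iou_matrix(ex_boxes, gt_boxes):
    """IoU between each example box (rows) and each gt box (columns);
    boxes are (x0, y0, x1, y1)."""
    ious = np.zeros((len(ex_boxes), len(gt_boxes)), dtype=np.float32)
    for i, (ex0, ey0, ex1, ey1) in enumerate(ex_boxes):
        ex_area = (ex1 - ex0) * (ey1 - ey0)
        for j, (gx0, gy0, gx1, gy1) in enumerate(gt_boxes):
            iw = min(ex1, gx1) - max(ex0, gx0)
            ih = min(ey1, gy1) - max(ey0, gy0)
            if iw <= 0 or ih <= 0:
                continue  # no intersection, IoU stays 0
            inter = iw * ih
            union = ex_area + (gx1 - gx0) * (gy1 - gy0) - inter
            ious[i, j] = inter / float(union)
    return ious

ious = bbox_iou_matrix([(0, 0, 10, 10)], [(0, 0, 10, 10), (5, 5, 15, 15)])
# identical gt -> IoU 1.0; offset gt -> 25 / 175
```

`argmax(axis=1)` over this matrix is exactly the `gt_assignment` step above: each example box is assigned the ground-truth box it overlaps most.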
def _get_image_blob_s6_0(roidb, roidb_noclass1):
"""Builds an input blob from the images in the roidb at the specified
scales.
"""
num_images = len(roidb)
# Sample random scales to use for each image in this batch
scale_inds = np.random.randint(
0, high=len(cfg.TRAIN.SCALES), size=num_images
)
processed_ims = []
im_scales = []
error_flag = [0,0]
for i in range(num_images):
roidb_noclass = roidb_noclass1.copy()
if roidb[i][u'image'].split('/')[-1]==u'test.jpg':
random_bbox = dict()
random_bbox['kernel_size'] = 224
random_bbox['tl_x'] = 0
random_bbox['tl_y'] = 0
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size']
im = cv2.imread(roidb[i]['image'])[y0:y1, x0:x1]
im = cv2.resize(im,(WIDTH,HEIGHT))
#cv2.imwrite('/home/icubic/aa.png',im)
error_flag[i] = 0
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
if 1:
real_class = []#roidb[i]['gt_classes'][0]
num_real_class = len(roidb[i]['gt_classes'])
random_bbox = dict()
random_bbox['kernel_size'] = 224
random_bbox['tl_x'] = random.randint(0, 800)
random_bbox['tl_y'] = random.randint(0, 800)
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size']
im = cv2.imread(roidb[i]['image'])[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
sum_inside_overlaps = 0
boxes_inside_overlaps = []
for i_roidb,sub_boxes in enumerate(roidb[i][u'boxes']):
crop_x0 = int(sub_boxes[0])
crop_y0 = int(sub_boxes[1])
crop_x1 = int(sub_boxes[2])
crop_y1 = int(sub_boxes[3])
#real_x0 = float(crop_x0 - x0)*1024/224 # float(crop_x0) / 1024 * 224
#real_y0 = float(crop_y0 - y0)*1024/224 # float(crop_y0) / 1024 * 224
#real_x1 = float(crop_x1 - x0)*1024/224 # float(crop_x1) / 1024 * 224
#real_y1 = float(crop_y1 - y0)*1024/224
overlaps_rate = solve_coincide((x0, y0, x1, y1), (crop_x0, crop_y0, crop_x1, crop_y1))
if overlaps_rate>0.9:
sum_inside_overlaps = sum_inside_overlaps + 1
#real_x0 = crop_x0 - x0 # float(crop_x0) / 1024 * 224
#real_y0 = crop_y0 - y0 # float(crop_y0) / 1024 * 224
#real_x1 = crop_x1 - x0 # float(crop_x1) / 1024 * 224
#real_y1 = crop_y1 - y0
real_x0 = float(crop_x0 - x0)*WIDTH/224 # float(crop_x0) / 1024 * 224
real_y0 = float(crop_y0 - y0)*HEIGHT/224 # float(crop_y0) / 1024 * 224
real_x1 = float(crop_x1 - x0)*WIDTH/224 # float(crop_x1) / 1024 * 224
real_y1 = float(crop_y1 - y0)*HEIGHT/224
# Clamp the shifted box to the network input bounds
real_x0 = min(max(real_x0, 0), WIDTH)
real_x1 = min(max(real_x1, 0), WIDTH)
real_y0 = min(max(real_y0, 0), HEIGHT)
real_y1 = min(max(real_y1, 0), HEIGHT)
boxes_inside_overlaps.append([real_x0, real_y0, real_x1, real_y1])
real_class.append(roidb[i]['gt_classes'][i_roidb])
#cv2.rectangle(im, (int(real_x0), int(real_y0)),
#(int(real_x1), int(real_y1)), (255, 0, 255))
#cv2.imwrite('/home/icubic/daily_work/code/circruit/new/result/uu.png', im)
#a = roidb[i]['gt_overlaps'].toarray()
if sum_inside_overlaps>0:
num_valid_objs = sum_inside_overlaps*1
boxes = np.zeros((num_valid_objs, 4), dtype=np.float32)
gt_classes = np.zeros((num_valid_objs), dtype=np.int32)
gt_overlaps = np.zeros((num_valid_objs, 3), dtype=np.float32)
box_to_gt_ind_map = np.zeros((num_valid_objs), dtype=np.int32)
is_crowd = np.zeros(num_valid_objs, dtype=bool)  # np.bool was removed in NumPy 1.24
for ix in range(num_valid_objs):
gt_classes[ix] = real_class[ix]#real_class*1
try:
gt_overlaps[ix, real_class[ix]] = 1.0
except IndexError:
print('error: gt class index out of range for gt_overlaps')
is_crowd[ix] = False
box_to_gt_ind_map[ix] = ix
for i_index in range(4):
boxes[ix,i_index] = boxes_inside_overlaps[ix][i_index]
#for ix in range(num_valid_objs):
#box_to_gt_ind_map[ix] = ix
#cls = real_class*1
roidb_noclass['boxes'] = np.append(roidb_noclass['boxes'], boxes, axis=0)
roidb_noclass['gt_classes'] = np.append(roidb_noclass['gt_classes'], gt_classes)
#mm = np.append(
# roidb_noclass['gt_overlaps'].toarray(), gt_overlaps,axis=0)
# Append row-wise; without axis=0, np.append flattens the array and the
# (num_objs, num_classes) shape needed by max/argmax below is lost
roidb_noclass['gt_overlaps'] = np.append(
roidb_noclass['gt_overlaps'].toarray(), gt_overlaps, axis=0)
roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(roidb_noclass['gt_overlaps'])
#mm = np.append(mm, gt_overlaps, axis=0)
#roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(mm)
roidb_noclass['is_crowd'] = np.append(roidb_noclass['is_crowd'], is_crowd)
roidb_noclass['box_to_gt_ind_map'] = np.append(roidb_noclass['box_to_gt_ind_map'], box_to_gt_ind_map)
gt_overlaps = roidb_noclass['gt_overlaps'].toarray()
# max overlap with gt over classes (columns)
max_overlaps = gt_overlaps.max(axis=1)
# gt class that had the max overlap
max_classes = gt_overlaps.argmax(axis=1)
roidb_noclass['max_classes'] = max_classes
roidb_noclass['max_overlaps'] = max_overlaps
# sanity checks
# if max overlap is 0, the class must be background (class 0)
zero_inds = np.where(max_overlaps == 0)[0]
assert all(max_classes[zero_inds] == 0)
# if max overlap > 0, the class must be a fg class (not class 0)
nonzero_inds = np.where(max_overlaps > 0)[0]
assert all(max_classes[nonzero_inds] != 0)
roidb_noclass['bbox_targets'] = compute_bbox_regression_targets(roidb_noclass)
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
if 0:
if sum_inside_overlaps==0:
roidb[i] = roidb_noclass['0'].copy()
roidb[i][u'height'] = 1024
roidb[i][u'width'] = 1024
if sum_inside_overlaps==1:
num_valid_objs = 1
roidb[i] = roidb_noclass['1'].copy()
a = roidb[i]['gt_overlaps'].toarray()
#for i_inside in enumerate(sum_inside_overlaps)
if sum_inside_overlaps==2:
num_valid_objs = 2
roidb[i] = roidb_noclass['2'].copy()
a = roidb[i]['gt_overlaps'].toarray()
if sum_inside_overlaps==3:
num_valid_objs = 3
roidb[i] = roidb_noclass['3'].copy()
a = roidb[i]['gt_overlaps'].toarray()
if 0:
crop_x0 = int(roidb[i][u'boxes'][0][0])
crop_y0 = int(roidb[i][u'boxes'][0][1])
crop_x1 = int(roidb[i][u'boxes'][0][2])
crop_y1 = int(roidb[i][u'boxes'][0][3])
crop_w = crop_x1 - crop_x0
crop_h = crop_y1 - crop_y0
random_bbox = dict()
random_bbox['kernel_size'] = 224
random_bbox['tl_x'] = random.randint(0, 800)
random_bbox['tl_y'] = random.randint(0, 800)
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size']
#real_x0 = crop_x0-x0#float(crop_x0) / 1024 * 224
#real_y0 = crop_y0-y0#float(crop_y0) / 1024 * 224
#real_x1 = 1024#float(crop_x1) / 1024 * 224
#real_y1 = 1024#float(crop_y1) / 1024 * 224
overlaps_rate = solve_coincide((x0,y0,x1,y1),(crop_x0,crop_y0,crop_x1,crop_y1))
im = cv2.imread(roidb[i]['image'])[y0:y1, x0:x1]
#im = cv2.resize(im, (1024, 1024))
if overlaps_rate>0.9:
real_x0 = crop_x0 - x0 # float(crop_x0) / 1024 * 224
real_y0 = crop_y0 - y0 # float(crop_y0) / 1024 * 224
real_x1 = crop_x1 - x0 # float(crop_x1) / 1024 * 224
real_y1 = crop_y1 - y0
roidb[i][u'boxes'][0][0] = real_x0
roidb[i][u'boxes'][0][1] = real_y0
roidb[i][u'boxes'][0][2] = real_x1
roidb[i][u'boxes'][0][3] = real_y1
roidb[i][u'height'] = 224
roidb[i][u'width'] = 224
error_flag[i] = 1
#cv2.imwrite('/home/icubic/daily_work/code/Detectron/detectron/datasets/data/s6_test/aa.png',im)
else:
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = 224
roidb[i][u'width'] = 224
error_flag[i] = 0
#print('aa')
assert im is not None, \
'Failed to read image \'{}\''.format(roidb[i]['image'])
if roidb[i]['flipped']:
im = im[:, ::-1, :]
target_size = cfg.TRAIN.SCALES[scale_inds[i]]
im, im_scale = blob_utils.prep_im_for_blob(
im, cfg.PIXEL_MEANS, target_size, cfg.TRAIN.MAX_SIZE
)
im_scales.append(im_scale)
processed_ims.append(im)
# Create a blob to hold the input images
blob = blob_utils.im_list_to_blob(processed_ims)
return blob, im_scales, error_flag
def _get_image_blob_s6(roidb, roidb_noclass1):
"""Builds an input blob from the images in the roidb at the specified
scales.
"""
num_images = len(roidb)
# Sample random scales to use for each image in this batch
scale_inds = np.random.randint(
0, high=len(cfg.TRAIN.SCALES), size=num_images
)
processed_ims = []
im_scales = []
error_flag = [0,0]
for i in range(num_images):
roidb_noclass = roidb_noclass1.copy()
if roidb[i][u'image'].split('/')[-1]==u'test.png': #test.jpg
random_bbox = dict()
random_bbox['kernel_size_x'] = int(WIDTH / 5)
random_bbox['kernel_size_y'] = int(HEIGHT / 5)
random_bbox['tl_x'] = 0
random_bbox['tl_y'] = 0
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
im = cv2.imread(roidb[i]['image'])[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
# cv2.imwrite('/home/icubic/aa.png',im)
error_flag[i] = 0
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
if 1:
if len(roidb[i][u'boxes']) == 0:
random_bbox = dict()
random_flag = random.randint(0, 1)
real_yuanlai_width = roidb[i][u'width'] * 1
real_yuanlai_height = roidb[i][u'height'] * 1
width_ratio = float(real_yuanlai_width) / 1024
height_after_ratio = int(float(real_yuanlai_height) / width_ratio)
width_after_ratio = 1024
if 1:
if random_flag == 0:
#print(random_flag)
random_bbox['kernel_size_x'] = int(WIDTH / 5)
random_bbox['kernel_size_y'] = int(HEIGHT / 5)
random_X = width_after_ratio - random_bbox['kernel_size_x']
random_Y = height_after_ratio - random_bbox['kernel_size_y']
# Clamp so randint never sees a negative upper bound (crop larger than
# the resized image); the old bare except left tl_x/tl_y unset on failure
random_bbox['tl_x'] = random.randint(0, max(random_X, 0))
random_bbox['tl_y'] = random.randint(0, max(random_Y, 0))
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
im = cv2.imread(roidb[i][u'image'])
im = cv2.resize(im, (width_after_ratio, height_after_ratio))[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
#print(random_flag)
random_bbox['kernel_size_x'] = int(float(width_after_ratio) / 1.2)
random_bbox['kernel_size_y'] = int(float(height_after_ratio) / 1.2)
random_X = width_after_ratio - random_bbox['kernel_size_x']
random_Y = height_after_ratio - random_bbox['kernel_size_y']
random_bbox['tl_x'] = random.randint(0, random_X)
random_bbox['tl_y'] = random.randint(0, random_Y)
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
im = cv2.imread(roidb[i][u'image'])
im = cv2.resize(im, (width_after_ratio, height_after_ratio))[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
im = cv2.imread(roidb[i][u'image'])
im = cv2.resize(im, (WIDTH, HEIGHT))
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
# cv2.imwrite('/home/icubic/daily_work/circruit_model/tmp_images/aa.png',im)
assert im is not None, \
'Failed to read image \'{}\''.format(roidb[i]['image'])
if roidb[i]['flipped']:#for image flip background training
im = im[:, ::-1, :]
target_size = cfg.TRAIN.SCALES[scale_inds[i]]
im, im_scale = blob_utils.prep_im_for_blob(
im, cfg.PIXEL_MEANS, target_size, cfg.TRAIN.MAX_SIZE
)
im_scales.append(im_scale)
processed_ims.append(im)
continue
real_yuanlai_width = roidb[i][u'width'] * 1
real_yuanlai_height = roidb[i][u'height'] * 1
width_ratio = float(real_yuanlai_width) / 1024
height_after_ratio = int(float(real_yuanlai_height) / width_ratio)
width_after_ratio = 1024
real_class = []#roidb[i]['gt_classes'][0]
num_real_class = len(roidb[i]['gt_classes'])
random_bbox = dict()
random_bbox['kernel_size_x'] = int(WIDTH / 5)
random_bbox['kernel_size_y'] = int(HEIGHT / 5)
if 1:
w_tongji = 0
h_tongji = 0
for i_tongji, sub_boxes_tongji in enumerate(roidb[i][u'boxes']):
crop_x0_tongji = int(sub_boxes_tongji[0] / real_yuanlai_width * width_after_ratio)
crop_y0_tongji = int(sub_boxes_tongji[1] / real_yuanlai_height * height_after_ratio)
crop_x1_tongji = int(sub_boxes_tongji[2] / real_yuanlai_width * width_after_ratio)
crop_y1_tongji = int(sub_boxes_tongji[3] / real_yuanlai_height * height_after_ratio)
w_tongji = crop_x1_tongji - crop_x0_tongji
h_tongji = crop_y1_tongji - crop_y0_tongji
if w_tongji>int(WIDTH / 5) or h_tongji>int(HEIGHT / 5):
random_bbox['kernel_size_x'] = int(float(width_after_ratio) / 1.2)
random_bbox['kernel_size_y'] = int(float(height_after_ratio) / 1.2)
random_X = width_after_ratio - random_bbox['kernel_size_x']
random_Y = height_after_ratio - random_bbox['kernel_size_y']
random_bbox['tl_x'] = random.randint(0, random_X)
random_bbox['tl_y'] = random.randint(0, random_Y)
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
# cv2.imread does not raise on a bad path, it returns None; the assert
# further below catches that case, so no try/except is needed here
im = cv2.imread(roidb[i][u'image'])
im = cv2.resize(im, (width_after_ratio, height_after_ratio))[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
sum_inside_overlaps = 0
boxes_inside_overlaps = []
for i_roidb,sub_boxes in enumerate(roidb[i][u'boxes']):
crop_x0 = int(sub_boxes[0]/real_yuanlai_width*width_after_ratio)
crop_y0 = int(sub_boxes[1]/real_yuanlai_height*height_after_ratio)
crop_x1 = int(sub_boxes[2]/real_yuanlai_width*width_after_ratio)
crop_y1 = int(sub_boxes[3]/real_yuanlai_height*height_after_ratio)
#real_x0 = float(crop_x0 - x0)*1024/224 # float(crop_x0) / 1024 * 224
#real_y0 = float(crop_y0 - y0)*1024/224 # float(crop_y0) / 1024 * 224
#real_x1 = float(crop_x1 - x0)*1024/224 # float(crop_x1) / 1024 * 224
#real_y1 = float(crop_y1 - y0)*1024/224
overlaps_rate = solve_coincide((x0, y0, x1, y1), (crop_x0, crop_y0, crop_x1, crop_y1))
if overlaps_rate>0.9:
sum_inside_overlaps = sum_inside_overlaps + 1
#real_x0 = crop_x0 - x0 # float(crop_x0) / 1024 * 224
#real_y0 = crop_y0 - y0 # float(crop_y0) / 1024 * 224
#real_x1 = crop_x1 - x0 # float(crop_x1) / 1024 * 224
#real_y1 = crop_y1 - y0
real_x0 = float(crop_x0 - x0)*WIDTH/(random_bbox['kernel_size_x']) # float(crop_x0) / 1024 * 224
real_y0 = float(crop_y0 - y0)*HEIGHT/(random_bbox['kernel_size_y']) # float(crop_y0) / 1024 * 224
real_x1 = float(crop_x1 - x0)*WIDTH/(random_bbox['kernel_size_x']) # float(crop_x1) / 1024 * 224
real_y1 = float(crop_y1 - y0)*HEIGHT/(random_bbox['kernel_size_y'])
# Clamp the rescaled box to the network input bounds
real_x0 = min(max(real_x0, 0), WIDTH)
real_x1 = min(max(real_x1, 0), WIDTH)
real_y0 = min(max(real_y0, 0), HEIGHT)
real_y1 = min(max(real_y1, 0), HEIGHT)
#cv2.rectangle(im, (int(real_x0), int(real_y0)), (int(real_x1), int(real_y1)), (0, 255, 255), 3)
#cv2.imwrite('/home/icubic/daily_work/code/Detectron/detectron/datasets/data/shanghai/aa.png',im)
boxes_inside_overlaps.append([real_x0, real_y0, real_x1, real_y1])
real_class.append(roidb[i]['gt_classes'][i_roidb])
#cv2.rectangle(im, (int(real_x0), int(real_y0)),
#(int(real_x1), int(real_y1)), (255, 0, 255))
#cv2.imwrite('/home/icubic/daily_work/code/circruit/new/result/uu.png', im)
#a = roidb[i]['gt_overlaps'].toarray()
if sum_inside_overlaps>0 :
num_valid_objs = sum_inside_overlaps*1
boxes = np.zeros((num_valid_objs, 4), dtype=np.float32)
gt_classes = np.zeros((num_valid_objs), dtype=np.int32)
gt_overlaps = np.zeros((num_valid_objs, REAL_CLASS), dtype=np.float32)
box_to_gt_ind_map = np.zeros((num_valid_objs), dtype=np.int32)
is_crowd = np.zeros(num_valid_objs, dtype=bool)  # np.bool was removed in NumPy 1.24
for ix in range(num_valid_objs):
gt_classes[ix] = real_class[ix]#real_class*1
try:
gt_overlaps[ix, real_class[ix]] = 1.0
except IndexError:
print('error: gt class index out of range for gt_overlaps')
is_crowd[ix] = False
box_to_gt_ind_map[ix] = ix
for i_index in range(4):
boxes[ix,i_index] = boxes_inside_overlaps[ix][i_index]
#for ix in range(num_valid_objs):
#box_to_gt_ind_map[ix] = ix
#cls = real_class*1
roidb_noclass['boxes'] = np.append(roidb_noclass['boxes'], boxes, axis=0)
roidb_noclass['gt_classes'] = np.append(roidb_noclass['gt_classes'], gt_classes)
#mm = np.append(
# roidb_noclass['gt_overlaps'].toarray(), gt_overlaps,axis=0)
# Append row-wise; without axis=0, np.append flattens the array and the
# (num_objs, num_classes) shape needed by max/argmax below is lost
roidb_noclass['gt_overlaps'] = np.append(
roidb_noclass['gt_overlaps'].toarray(), gt_overlaps, axis=0)
roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(roidb_noclass['gt_overlaps'])
#mm = np.append(mm, gt_overlaps, axis=0)
#roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(mm)
roidb_noclass['is_crowd'] = np.append(roidb_noclass['is_crowd'], is_crowd)
roidb_noclass['box_to_gt_ind_map'] = np.append(roidb_noclass['box_to_gt_ind_map'], box_to_gt_ind_map)
gt_overlaps = roidb_noclass['gt_overlaps'].toarray()
# max overlap with gt over classes (columns)
max_overlaps = gt_overlaps.max(axis=1)
# gt class that had the max overlap
max_classes = gt_overlaps.argmax(axis=1)
roidb_noclass['max_classes'] = max_classes
roidb_noclass['max_overlaps'] = max_overlaps
# sanity checks
# if max overlap is 0, the class must be background (class 0)
zero_inds = np.where(max_overlaps == 0)[0]
assert all(max_classes[zero_inds] == 0)
# if max overlap > 0, the class must be a fg class (not class 0)
nonzero_inds = np.where(max_overlaps > 0)[0]
assert all(max_classes[nonzero_inds] != 0)
roidb_noclass['bbox_targets'] = compute_bbox_regression_targets(roidb_noclass)
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
#print('aa')
assert im is not None, \
'Failed to read image \'{}\''.format(roidb[i]['image'])
if roidb[i]['flipped']:
im = im[:, ::-1, :]
target_size = cfg.TRAIN.SCALES[scale_inds[i]]
im, im_scale = blob_utils.prep_im_for_blob(
im, cfg.PIXEL_MEANS, target_size, cfg.TRAIN.MAX_SIZE
)
im_scales.append(im_scale)
processed_ims.append(im)
# Create a blob to hold the input images
blob = blob_utils.im_list_to_blob(processed_ims)
return blob, im_scales, error_flag
def _get_image_blob_s6_ok(roidb, roidb_noclass1):
"""Builds an input blob from the images in the roidb at the specified
scales.
"""
num_images = len(roidb)
# Sample random scales to use for each image in this batch
scale_inds = np.random.randint(
0, high=len(cfg.TRAIN.SCALES), size=num_images
)
processed_ims = []
im_scales = []
error_flag = [0,0]
for i in range(num_images):
roidb_noclass = roidb_noclass1.copy()
if roidb[i][u'image'].split('/')[-1]==u'test.jpg':
random_bbox = dict()
random_bbox['kernel_size_x'] = int(WIDTH / 5)
random_bbox['kernel_size_y'] = int(HEIGHT / 5)
random_bbox['tl_x'] = 0
random_bbox['tl_y'] = 0
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
im = cv2.imread(roidb[i]['image'])[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
# cv2.imwrite('/home/icubic/aa.png',im)
error_flag[i] = 0
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
if 1:
real_yuanlai_width = roidb[i][u'width'] * 1
real_yuanlai_height = roidb[i][u'height'] * 1
width_ratio = float(real_yuanlai_width) / 1024
height_after_ratio = int(float(real_yuanlai_height) / width_ratio)
width_after_ratio = 1024
real_class = []#roidb[i]['gt_classes'][0]
num_real_class = len(roidb[i]['gt_classes'])
random_bbox = dict()
random_bbox['kernel_size_x'] = int(WIDTH / 5)
random_bbox['kernel_size_y'] = int(HEIGHT / 5)
random_X = width_after_ratio - random_bbox['kernel_size_x']
random_Y = height_after_ratio - random_bbox['kernel_size_y']
random_bbox['tl_x'] = random.randint(0, random_X)
random_bbox['tl_y'] = random.randint(0, random_Y)
x0 = random_bbox['tl_x']
x1 = random_bbox['tl_x'] + random_bbox['kernel_size_x']
y0 = random_bbox['tl_y']
y1 = random_bbox['tl_y'] + random_bbox['kernel_size_y']
im = cv2.imread(roidb[i]['image'])
im = cv2.resize(im, (width_after_ratio, height_after_ratio))[y0:y1, x0:x1]
im = cv2.resize(im, (WIDTH, HEIGHT))
sum_inside_overlaps = 0
boxes_inside_overlaps = []
for i_roidb,sub_boxes in enumerate(roidb[i][u'boxes']):
crop_x0 = int(sub_boxes[0]/real_yuanlai_width*width_after_ratio)
crop_y0 = int(sub_boxes[1]/real_yuanlai_height*height_after_ratio)
crop_x1 = int(sub_boxes[2]/real_yuanlai_width*width_after_ratio)
crop_y1 = int(sub_boxes[3]/real_yuanlai_height*height_after_ratio)
#real_x0 = float(crop_x0 - x0)*1024/224 # float(crop_x0) / 1024 * 224
#real_y0 = float(crop_y0 - y0)*1024/224 # float(crop_y0) / 1024 * 224
#real_x1 = float(crop_x1 - x0)*1024/224 # float(crop_x1) / 1024 * 224
#real_y1 = float(crop_y1 - y0)*1024/224
overlaps_rate = solve_coincide((x0, y0, x1, y1), (crop_x0, crop_y0, crop_x1, crop_y1))
if overlaps_rate>0.9:
sum_inside_overlaps = sum_inside_overlaps + 1
#real_x0 = crop_x0 - x0 # float(crop_x0) / 1024 * 224
#real_y0 = crop_y0 - y0 # float(crop_y0) / 1024 * 224
#real_x1 = crop_x1 - x0 # float(crop_x1) / 1024 * 224
#real_y1 = crop_y1 - y0
real_x0 = float(crop_x0 - x0)*WIDTH/(random_bbox['kernel_size_x']) # float(crop_x0) / 1024 * 224
real_y0 = float(crop_y0 - y0)*HEIGHT/(random_bbox['kernel_size_y']) # float(crop_y0) / 1024 * 224
real_x1 = float(crop_x1 - x0)*WIDTH/(random_bbox['kernel_size_x']) # float(crop_x1) / 1024 * 224
real_y1 = float(crop_y1 - y0)*HEIGHT/(random_bbox['kernel_size_y'])
# Clamp the rescaled box to the network input bounds
real_x0 = min(max(real_x0, 0), WIDTH)
real_x1 = min(max(real_x1, 0), WIDTH)
real_y0 = min(max(real_y0, 0), HEIGHT)
real_y1 = min(max(real_y1, 0), HEIGHT)
#cv2.rectangle(im, (int(real_x0), int(real_y0)), (int(real_x1), int(real_y1)), (0, 255, 255), 3)
#cv2.imwrite('/home/icubic/daily_work/code/Detectron/detectron/datasets/data/shanghai/aa.png',im)
boxes_inside_overlaps.append([real_x0, real_y0, real_x1, real_y1])
real_class.append(roidb[i]['gt_classes'][i_roidb])
#cv2.rectangle(im, (int(real_x0), int(real_y0)),
#(int(real_x1), int(real_y1)), (255, 0, 255))
#cv2.imwrite('/home/icubic/daily_work/code/circruit/new/result/uu.png', im)
#a = roidb[i]['gt_overlaps'].toarray()
if sum_inside_overlaps>0:
num_valid_objs = sum_inside_overlaps*1
boxes = np.zeros((num_valid_objs, 4), dtype=np.float32)
gt_classes = np.zeros((num_valid_objs), dtype=np.int32)
gt_overlaps = np.zeros((num_valid_objs, REAL_CLASS), dtype=np.float32)
box_to_gt_ind_map = np.zeros((num_valid_objs), dtype=np.int32)
is_crowd = np.zeros(num_valid_objs, dtype=bool)  # np.bool was removed in NumPy 1.24
for ix in range(num_valid_objs):
gt_classes[ix] = real_class[ix]#real_class*1
try:
gt_overlaps[ix, real_class[ix]] = 1.0
except IndexError:
print('error: gt class index out of range for gt_overlaps')
is_crowd[ix] = False
box_to_gt_ind_map[ix] = ix
for i_index in range(4):
boxes[ix,i_index] = boxes_inside_overlaps[ix][i_index]
#for ix in range(num_valid_objs):
#box_to_gt_ind_map[ix] = ix
#cls = real_class*1
roidb_noclass['boxes'] = np.append(roidb_noclass['boxes'], boxes, axis=0)
roidb_noclass['gt_classes'] = np.append(roidb_noclass['gt_classes'], gt_classes)
#mm = np.append(
# roidb_noclass['gt_overlaps'].toarray(), gt_overlaps,axis=0)
# Append row-wise; without axis=0, np.append flattens the array and the
# (num_objs, num_classes) shape needed by max/argmax below is lost
roidb_noclass['gt_overlaps'] = np.append(
roidb_noclass['gt_overlaps'].toarray(), gt_overlaps, axis=0)
roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(roidb_noclass['gt_overlaps'])
#mm = np.append(mm, gt_overlaps, axis=0)
#roidb_noclass['gt_overlaps'] = scipy.sparse.csr_matrix(mm)
roidb_noclass['is_crowd'] = np.append(roidb_noclass['is_crowd'], is_crowd)
roidb_noclass['box_to_gt_ind_map'] = np.append(roidb_noclass['box_to_gt_ind_map'], box_to_gt_ind_map)
gt_overlaps = roidb_noclass['gt_overlaps'].toarray()
# max overlap with gt over classes (columns)
max_overlaps = gt_overlaps.max(axis=1)
# gt class that had the max overlap
max_classes = gt_overlaps.argmax(axis=1)
roidb_noclass['max_classes'] = max_classes
roidb_noclass['max_overlaps'] = max_overlaps
# sanity checks
# if max overlap is 0, the class must be background (class 0)
zero_inds = np.where(max_overlaps == 0)[0]
assert all(max_classes[zero_inds] == 0)
# if max overlap > 0, the class must be a fg class (not class 0)
nonzero_inds = np.where(max_overlaps > 0)[0]
assert all(max_classes[nonzero_inds] != 0)
roidb_noclass['bbox_targets'] = compute_bbox_regression_targets(roidb_noclass)
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
else:
roidb[i] = roidb_noclass.copy()
roidb[i][u'height'] = HEIGHT
roidb[i][u'width'] = WIDTH
#print('aa')
assert im is not None, \
'Failed to read image \'{}\''.format(roidb[i]['image'])
if roidb[i]['flipped']:
im = im[:, ::-1, :]
target_size = cfg.TRAIN.SCALES[scale_inds[i]]
im, im_scale = blob_utils.prep_im_for_blob(
im, cfg.PIXEL_MEANS, target_size, cfg.TRAIN.MAX_SIZE
)
im_scales.append(im_scale)
processed_ims.append(im)
# Create a blob to hold the input images
blob = blob_utils.im_list_to_blob(processed_ims)
return blob, im_scales, error_flag
| 47.395178 | 121 | 0.528519 | 5,730 | 45,215 | 3.902792 | 0.069634 | 0.052319 | 0.017842 | 0.046505 | 0.848768 | 0.833162 | 0.81532 | 0.80499 | 0.802173 | 0.795645 | 0 | 0.046214 | 0.359682 | 45,215 | 953 | 122 | 47.444911 | 0.726202 | 0.177729 | 0 | 0.779104 | 0 | 0 | 0.057404 | 0 | 0 | 0 | 0 | 0 | 0.016418 | 1 | 0.016418 | false | 0 | 0.022388 | 0 | 0.059701 | 0.007463 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c3384f46ef6c29cb9a1666b45c279ae93eb16505 | 50,711 | py | Python | morepath/tests/test_path_directive.py | timgates42/morepath | 09972904229f807da75c75d8825af1495057acdc | [
"BSD-3-Clause"
] | 314 | 2015-01-01T01:42:52.000Z | 2022-01-07T21:46:15.000Z | morepath/tests/test_path_directive.py | timgates42/morepath | 09972904229f807da75c75d8825af1495057acdc | [
"BSD-3-Clause"
] | 369 | 2015-01-02T19:10:40.000Z | 2021-07-03T04:37:27.000Z | morepath/tests/test_path_directive.py | timgates42/morepath | 09972904229f807da75c75d8825af1495057acdc | [
"BSD-3-Clause"
] | 37 | 2015-01-11T09:22:02.000Z | 2021-07-02T20:48:20.000Z |
import dectate
import morepath
from morepath.converter import Converter
from morepath.error import (
DirectiveReportError,
ConfigError,
LinkError,
TrajectError,
)
from webtest import TestApp as Client
import pytest
def test_simple_path_one_step():
class app(morepath.App):
pass
class Model:
def __init__(self):
pass
@app.path(model=Model, path="simple")
def get_model():
return Model()
@app.view(model=Model)
def default(self, request):
return "View"
@app.view(model=Model, name="link")
def link(self, request):
return request.link(self)
c = Client(app())
response = c.get("/simple")
assert response.body == b"View"
response = c.get("/simple/link")
assert response.body == b"http://localhost/simple"
def test_simple_path_two_steps():
class app(morepath.App):
pass
class Model:
def __init__(self):
pass
@app.path(model=Model, path="one/two")
def get_model():
return Model()
@app.view(model=Model)
def default(self, request):
return "View"
@app.view(model=Model, name="link")
def link(self, request):
return request.link(self)
c = Client(app())
response = c.get("/one/two")
assert response.body == b"View"
response = c.get("/one/two/link")
assert response.body == b"http://localhost/one/two"

def test_variable_path_one_step():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, name):
            self.name = name

    @app.path(model=Model, path="{name}")
    def get_model(name):
        return Model(name)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.name

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/foo")
    assert response.body == b"View: foo"

    response = c.get("/foo/link")
    assert response.body == b"http://localhost/foo"


def test_variable_path_two_steps():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, name):
            self.name = name

    @app.path(model=Model, path="document/{name}")
    def get_model(name):
        return Model(name)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.name

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/document/foo")
    assert response.body == b"View: foo"

    response = c.get("/document/foo/link")
    assert response.body == b"http://localhost/document/foo"


def test_variable_path_two_variables():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, name, version):
            self.name = name
            self.version = version

    @app.path(model=Model, path="{name}-{version}")
    def get_model(name, version):
        return Model(name, version)

    @app.view(model=Model)
    def default(self, request):
        return f"View: {self.name} {self.version}"

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/foo-one")
    assert response.body == b"View: foo one"

    response = c.get("/foo-one/link")
    assert response.body == b"http://localhost/foo-one"


def test_variable_path_explicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="{id}", converters=dict(id=Converter(int)))
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/1/link")
    assert response.body == b"http://localhost/1"

    response = c.get("/broken", status=404)


def test_variable_path_implicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="{id}")
    def get_model(id=0):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/1/link")
    assert response.body == b"http://localhost/1"

    response = c.get("/broken", status=404)


def test_variable_path_explicit_trumps_implicit():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="{id}", converters=dict(id=Converter(int)))
    def get_model(id="foo"):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/1/link")
    assert response.body == b"http://localhost/1"

    response = c.get("/broken", status=404)


def test_url_parameter_explicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="/", converters=dict(id=Converter(int)))
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/link?id=1")
    assert response.body == b"http://localhost/?id=1"

    response = c.get("/?id=broken", status=400)

    response = c.get("/")
    assert response.body in (
        b"View: None (<type 'NoneType'>)",
        b"View: None (<class 'NoneType'>)",
    )


def test_url_parameter_explicit_converter_get_converters():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    def get_converters():
        return dict(id=Converter(int))

    @app.path(model=Model, path="/", get_converters=get_converters)
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/link?id=1")
    assert response.body == b"http://localhost/?id=1"

    response = c.get("/?id=broken", status=400)

    response = c.get("/")
    assert response.body in (
        b"View: None (<type 'NoneType'>)",
        b"View: None (<class 'NoneType'>)",
    )

def test_url_parameter_get_converters_overrides_converters():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    def get_converters():
        return dict(id=Converter(int))

    @app.path(
        model=Model,
        path="/",
        converters={"id": type("")},
        get_converters=get_converters,
    )
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/link?id=1")
    assert response.body == b"http://localhost/?id=1"

    response = c.get("/?id=broken", status=400)

    response = c.get("/")
    assert response.body in (
        b"View: None (<type 'NoneType'>)",
        b"View: None (<class 'NoneType'>)",
    )

def test_url_parameter_implicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="/")
    def get_model(id=0):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/link?id=1")
    assert response.body == b"http://localhost/?id=1"

    response = c.get("/?id=broken", status=400)

    response = c.get("/")
    assert response.body in (
        b"View: 0 (<type 'int'>)",
        b"View: 0 (<class 'int'>)",
    )


def test_multiple_url_parameters_stable_order():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, a, b):
            self.a = a
            self.b = b

    @App.path(model=Model, path="/")
    def get_model(a, b):
        return Model(a, b)

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/link?a=A&b=B")
    assert response.body == b"http://localhost/?a=A&b=B"


def test_url_parameter_explicit_trumps_implicit():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="/", converters=dict(id=Converter(int)))
    def get_model(id="foo"):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: {} ({})".format(self.id, type(self.id))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=1")
    assert response.body in (
        b"View: 1 (<type 'int'>)",
        b"View: 1 (<class 'int'>)",
    )

    response = c.get("/link?id=1")
    assert response.body == b"http://localhost/?id=1"

    response = c.get("/?id=broken", status=400)

    response = c.get("/")
    assert response.body in (
        b"View: foo (<type 'str'>)",
        b"View: foo (<class 'str'>)",
    )


def test_decode_encode():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    def my_decode(s):
        return s + "ADD"

    def my_encode(s):
        return s[: -len("ADD")]

    @app.path(
        model=Model,
        path="/",
        converters=dict(id=Converter(my_decode, my_encode)),
    )
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.id

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=foo")
    assert response.body == b"View: fooADD"

    response = c.get("/link?id=foo")
    assert response.body == b"http://localhost/?id=foo"


def test_unknown_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    class Unknown:
        pass

    @app.path(model=Model, path="/")
    def get_model(d=Unknown()):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    with pytest.raises(DirectiveReportError):
        app.commit()


def test_not_all_path_variables_arguments_of_model_factory():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, foo):
            self.foo = foo

    class Unknown:
        pass

    @App.path(model=Model, path="/{foo}/{bar}")
    def get_model(foo):
        return Model(foo)

    with pytest.raises(DirectiveReportError) as e:
        App.commit()

    assert str(e.value).startswith(
        "Variable in path not found in function signature: bar"
    )


def test_unknown_explicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    class Unknown:
        pass

    @app.path(model=Model, path="/", converters={"d": Unknown})
    def get_model(d):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    with pytest.raises(DirectiveReportError):
        app.commit()

def test_default_date_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    from datetime import date

    @app.path(model=Model, path="/")
    def get_model(d=date(2011, 1, 1)):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?d=20121110")
    assert response.body == b"View: 2012-11-10"

    response = c.get("/")
    assert response.body == b"View: 2011-01-01"

    response = c.get("/link?d=20121110")
    assert response.body == b"http://localhost/?d=20121110"

    response = c.get("/link")
    assert response.body == b"http://localhost/?d=20110101"

    response = c.get("/?d=broken", status=400)


def test_default_datetime_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    from datetime import datetime

    @app.path(model=Model, path="/")
    def get_model(d=datetime(2011, 1, 1, 10, 30)):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?d=20121110T144530")
    assert response.body == b"View: 2012-11-10 14:45:30"

    response = c.get("/")
    assert response.body == b"View: 2011-01-01 10:30:00"

    response = c.get("/link?d=20121110T144500")
    assert response.body == b"http://localhost/?d=20121110T144500"

    response = c.get("/link")
    assert response.body == b"http://localhost/?d=20110101T103000"

    c.get("/?d=broken", status=400)


def test_custom_date_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    from datetime import date
    from time import strptime, mktime

    def date_decode(s):
        return date.fromtimestamp(mktime(strptime(s, "%d-%m-%Y")))

    def date_encode(d):
        return d.strftime("%d-%m-%Y")

    @app.converter(type=date)
    def date_converter():
        return Converter(date_decode, date_encode)

    @app.path(model=Model, path="/")
    def get_model(d=date(2011, 1, 1)):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?d=10-11-2012")
    assert response.body == b"View: 2012-11-10"

    response = c.get("/")
    assert response.body == b"View: 2011-01-01"

    response = c.get("/link?d=10-11-2012")
    assert response.body == b"http://localhost/?d=10-11-2012"

    response = c.get("/link")
    assert response.body == b"http://localhost/?d=01-01-2011"

    response = c.get("/?d=broken", status=400)


def test_variable_path_parameter_required_no_default():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="", required=["id"])
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.id

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=a")
    assert response.body == b"View: a"

    response = c.get("/", status=400)


def test_variable_path_parameter_required_with_default():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="", required=["id"])
    def get_model(id="b"):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.id

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?id=a")
    assert response.body == b"View: a"

    response = c.get("/", status=400)


def test_type_hints_and_converters():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, d):
            self.d = d

    from datetime import date

    @app.path(model=Model, path="", converters=dict(d=date))
    def get_model(d):
        return Model(d)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.d

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?d=20140120")
    assert response.body == b"View: 2014-01-20"

    response = c.get("/link?d=20140120")
    assert response.body == b"http://localhost/?d=20140120"

def test_link_for_none_means_no_parameter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Model, path="")
    def get_model(id):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "View: %s" % self.id

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/")
    assert response.body == b"View: None"

    response = c.get("/link")
    assert response.body == b"http://localhost/"


def test_path_and_url_parameter_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, id, param):
            self.id = id
            self.param = param

    from datetime import date

    @app.path(model=Model, path="/{id}", converters=dict(param=date))
    def get_model(id=0, param=None):
        return Model(id, param)

    @app.view(model=Model)
    def default(self, request):
        return f"View: {self.id} {self.param}"

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/1/link")
    assert response.body == b"http://localhost/1"


def test_path_converter_fallback_on_view():
    class app(morepath.App):
        pass

    class Root:
        pass

    class Model:
        def __init__(self, id):
            self.id = id

    @app.path(model=Root, path="")
    def get_root():
        return Root()

    @app.path(model=Model, path="/{id}")
    def get_model(id=0):
        return Model(id)

    @app.view(model=Model)
    def default(self, request):
        return "Default view for %s" % self.id

    @app.view(model=Root, name="named")
    def named(self, request):
        return "Named view on root"

    c = Client(app())

    response = c.get("/1")
    assert response.body == b"Default view for 1"

    response = c.get("/named")
    assert response.body == b"Named view on root"


def test_root_named_link():
    class app(morepath.App):
        pass

    @app.path(path="")
    class Root:
        pass

    @app.view(model=Root)
    def default(self, request):
        return request.link(self, "foo")

    c = Client(app())

    response = c.get("/")
    assert response.body == b"http://localhost/foo"


def test_path_class_and_model_argument():
    class app(morepath.App):
        pass

    class Foo:
        pass

    @app.path(path="", model=Foo)
    class Root:
        pass

    with pytest.raises(ConfigError):
        app.commit()


def test_path_no_class_and_no_model_argument():
    class app(morepath.App):
        pass

    @app.path(path="")
    def get_foo():
        return None

    with pytest.raises(ConfigError):
        app.commit()

def test_url_parameter_list():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, item):
            self.item = item

    @app.path(model=Model, path="/", converters={"item": [int]})
    def get_model(item):
        return Model(item)

    @app.view(model=Model)
    def default(self, request):
        return repr(self.item)

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?item=1&item=2")
    assert response.body == b"[1, 2]"

    response = c.get("/link?item=1&item=2")
    assert response.body == b"http://localhost/?item=1&item=2"

    response = c.get("/link")
    assert response.body == b"http://localhost/"

    response = c.get("/?item=broken&item=1", status=400)

    response = c.get("/")
    assert response.body == b"[]"


def test_url_parameter_list_empty():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, item):
            self.item = item

    @app.path(model=Model, path="/", converters={"item": []})
    def get_model(item):
        return Model(item)

    @app.view(model=Model)
    def default(self, request):
        return repr(self.item)

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?item=a&item=b")
    assert response.body in (b"[u'a', u'b']", b"['a', 'b']")

    response = c.get("/link?item=a&item=b")
    assert response.body == b"http://localhost/?item=a&item=b"

    response = c.get("/link")
    assert response.body == b"http://localhost/"

    response = c.get("/")
    assert response.body == b"[]"


def test_url_parameter_list_explicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, item):
            self.item = item

    @app.path(model=Model, path="/", converters={"item": [Converter(int)]})
    def get_model(item):
        return Model(item)

    @app.view(model=Model)
    def default(self, request):
        return repr(self.item)

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?item=1&item=2")
    assert response.body == b"[1, 2]"

    response = c.get("/link?item=1&item=2")
    assert response.body == b"http://localhost/?item=1&item=2"

    response = c.get("/link")
    assert response.body == b"http://localhost/"

    response = c.get("/?item=broken&item=1", status=400)

    response = c.get("/")
    assert response.body == b"[]"


def test_url_parameter_list_unknown_explicit_converter():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, item):
            self.item = item

    class Unknown:
        pass

    @app.path(model=Model, path="/", converters={"item": [Unknown]})
    def get_model(item):
        return Model(item)

    with pytest.raises(DirectiveReportError):
        app.commit()


def test_url_parameter_list_but_only_one_allowed():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, item):
            self.item = item

    @app.path(model=Model, path="/", converters={"item": int})
    def get_model(item):
        return Model(item)

    @app.view(model=Model)
    def default(self, request):
        return repr(self.item)

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    c.get("/?item=1&item=2", status=400)

    c.get("/link?item=1&item=2", status=400)

def test_extra_parameters():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, extra_parameters):
            self.extra_parameters = extra_parameters

    @app.path(model=Model, path="/")
    def get_model(extra_parameters):
        return Model(extra_parameters)

    @app.view(model=Model)
    def default(self, request):
        return repr(sorted(self.extra_parameters.items()))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?a=A&b=B")
    assert response.body in (
        b"[(u'a', u'A'), (u'b', u'B')]",
        b"[('a', 'A'), ('b', 'B')]",
    )

    response = c.get("/link?a=A&b=B")
    assert sorted(response.body[len("http://localhost/?") :].split(b"&")) == [
        b"a=A",
        b"b=B",
    ]


def test_extra_parameters_with_get_converters():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, extra_parameters):
            self.extra_parameters = extra_parameters

    def get_converters():
        return {
            "a": int,
            "b": type(""),
        }

    @app.path(model=Model, path="/", get_converters=get_converters)
    def get_model(extra_parameters):
        return Model(extra_parameters)

    @app.view(model=Model)
    def default(self, request):
        return repr(sorted(self.extra_parameters.items()))

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get("/?a=1&b=B")
    assert response.body in (
        b"[(u'a', 1), (u'b', u'B')]",
        b"[('a', 1), ('b', 'B')]",
    )

    response = c.get("/link?a=1&b=B")
    assert sorted(response.body[len("http://localhost/?") :].split(b"&")) == [
        b"a=1",
        b"b=B",
    ]

    c.get("/?a=broken&b=B", status=400)

def test_script_name():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @app.path(model=Model, path="simple")
    def get_model():
        return Model()

    @app.view(model=Model)
    def default(self, request):
        return "View"

    @app.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(app())

    response = c.get(
        "/prefix/simple", extra_environ=dict(SCRIPT_NAME="/prefix")
    )
    assert response.body == b"View"

    response = c.get(
        "/prefix/simple/link", extra_environ=dict(SCRIPT_NAME="/prefix")
    )
    assert response.body == b"http://localhost/prefix/simple"


def test_sub_path_different_variable():
    # See discussion in https://github.com/morepath/morepath/issues/155
    class App(morepath.App):
        pass

    class Foo:
        def __init__(self, id):
            self.id = id

    class Bar:
        def __init__(self, id, foo):
            self.id = id
            self.foo = foo

    @App.path(model=Foo, path="{id}")
    def get_foo(id):
        return Foo(id)

    @App.path(model=Bar, path="{foo_id}/{bar_id}")
    def get_client(foo_id, bar_id):
        return Bar(bar_id, Foo(foo_id))

    @App.view(model=Foo)
    def default_sbar(self, request):
        return "M: %s" % self.id

    @App.view(model=Bar)
    def default_bar(self, request):
        return f"S: {self.id} {self.foo.id}"

    c = Client(App())

    with pytest.raises(TrajectError) as ex:
        response = c.get("/a")
        assert response.body == b"M: a"
        response = c.get("/a/b")
        assert response.body == b"S: b a"

    assert str(ex.value) == "step {id} and {foo_id} are in conflict"

def test_absorb_path():
    class app(morepath.App):
        pass

    class Root:
        pass

    class Model:
        def __init__(self, absorb):
            self.absorb = absorb

    @app.path(model=Root, path="")
    def get_root():
        return Root()

    @app.path(model=Model, path="foo", absorb=True)
    def get_model(absorb):
        return Model(absorb)

    @app.view(model=Model)
    def default(self, request):
        return "%s" % self.absorb

    @app.view(model=Root)
    def default_root(self, request):
        return request.link(Model("a/b"))

    c = Client(app())

    response = c.get("/foo/a")
    assert response.body == b"a"

    response = c.get("/foo")
    assert response.body == b""

    response = c.get("/foo/a/b")
    assert response.body == b"a/b"

    # link to a/b absorb
    response = c.get("/")
    assert response.body == b"http://localhost/foo/a/b"


def test_absorb_path_with_variables():
    class app(morepath.App):
        pass

    class Root:
        pass

    class Model:
        def __init__(self, id, absorb):
            self.id = id
            self.absorb = absorb

    @app.path(model=Root, path="")
    def get_root():
        return Root()

    @app.path(model=Model, path="{id}", absorb=True)
    def get_model(id, absorb):
        return Model(id, absorb)

    @app.view(model=Model)
    def default(self, request):
        return f"I:{self.id} A:{self.absorb}"

    @app.view(model=Root)
    def default_root(self, request):
        return request.link(Model("foo", "a/b"))

    c = Client(app())

    response = c.get("/foo/a")
    assert response.body == b"I:foo A:a"

    response = c.get("/foo")
    assert response.body == b"I:foo A:"

    response = c.get("/foo/a/b")
    assert response.body == b"I:foo A:a/b"

    # link to a/b absorb
    response = c.get("/")
    assert response.body == b"http://localhost/foo/a/b"


def test_absorb_path_explicit_subpath_ignored():
    class app(morepath.App):
        pass

    class Root:
        pass

    class Model:
        def __init__(self, absorb):
            self.absorb = absorb

    class Another:
        pass

    @app.path(model=Root, path="")
    def get_root():
        return Root()

    @app.path(model=Model, path="foo", absorb=True)
    def get_model(absorb):
        return Model(absorb)

    @app.path(model=Another, path="foo/another")
    def get_another():
        return Another()

    @app.view(model=Model)
    def default(self, request):
        return "%s" % self.absorb

    @app.view(model=Another)
    def default_another(self, request):
        return "Another"

    @app.view(model=Root)
    def default_root(self, request):
        return request.link(Another())

    c = Client(app())

    response = c.get("/foo/a")
    assert response.body == b"a"

    response = c.get("/foo/another")
    assert response.body == b"another"

    # link to another still works XXX is this wrong?
    response = c.get("/")
    assert response.body == b"http://localhost/foo/another"

def test_absorb_path_root():
    class app(morepath.App):
        pass

    class Model:
        def __init__(self, absorb):
            self.absorb = absorb

    @app.path(model=Model, path="", absorb=True)
    def get_model(absorb):
        return Model(absorb)

    @app.view(model=Model)
    def default(self, request):
        return "A:{} L:{}".format(self.absorb, request.link(self))

    c = Client(app())

    response = c.get("/a")
    assert response.body == b"A:a L:http://localhost/a"

    response = c.get("/")
    assert response.body == b"A: L:http://localhost/"

    response = c.get("/a/b")
    assert response.body == b"A:a/b L:http://localhost/a/b"


def test_path_explicit_variables():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.store_id = id

    @App.path(
        model=Model, path="models/{id}", variables=lambda m: {"id": m.store_id}
    )
    def get_model(id):
        return Model(id)

    @App.view(model=Model)
    def default(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/models/1")
    assert response.body == b"http://localhost/models/1"


def test_path_explicit_variables_app_arg():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.store_id = id

    def my_variables(app, m):
        assert isinstance(app, App)
        return {"id": m.store_id}

    @App.path(model=Model, path="models/{id}", variables=my_variables)
    def get_model(id):
        return Model(id)

    @App.view(model=Model)
    def default(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/models/1")
    assert response.body == b"http://localhost/models/1"


def test_error_when_path_variable_is_none():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.store_id = id

    @App.path(model=Model, path="models/{id}", variables=lambda m: {"id": None})
    def get_model(id):
        return Model(id)

    @App.view(model=Model)
    def default(self, request):
        return request.link(self)

    c = Client(App())

    with pytest.raises(LinkError):
        c.get("/models/1")


def test_error_when_path_variable_is_missing():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.store_id = id

    @App.path(model=Model, path="models/{id}", variables=lambda m: {})
    def get_model(id):
        return Model(id)

    @App.view(model=Model)
    def default(self, request):
        return request.link(self)

    c = Client(App())

    with pytest.raises(KeyError):
        c.get("/models/1")


def test_error_when_path_variables_isnt_dict():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, id):
            self.store_id = id

    @App.path(model=Model, path="models/{id}", variables=lambda m: "nondict")
    def get_model(id):
        return Model(id)

    @App.view(model=Model)
    def default(self, request):
        return request.link(self)

    c = Client(App())

    with pytest.raises(LinkError):
        c.get("/models/1")

def test_resolve_path_method_on_request_same_app():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="simple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        return str(isinstance(request.resolve_path("simple"), Model))

    @App.view(model=Model, name="extra")
    def extra(self, request):
        return str(request.resolve_path("nonexistent") is None)

    @App.view(model=Model, name="appnone")
    def appnone(self, request):
        return request.resolve_path("simple", app=None)

    c = Client(App())

    response = c.get("/simple")
    assert response.body == b"True"

    response = c.get("/simple/extra")
    assert response.body == b"True"

    with pytest.raises(LinkError):
        c.get("/simple/appnone")


def test_resolve_path_method_on_request_different_app():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="simple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        obj = request.resolve_path("p", app=request.app.child("sub"))
        return str(isinstance(obj, SubModel))

    class Sub(morepath.App):
        pass

    class SubModel:
        pass

    @Sub.path(model=SubModel, path="p")
    def get_sub_model():
        return SubModel()

    @App.mount(path="sub", app=Sub)
    def mount_sub():
        return Sub()

    c = Client(App())

    response = c.get("/simple")
    assert response.body == b"True"


def test_resolve_path_with_dots_in_url():
    class app(morepath.App):
        pass

    class Root:
        def __init__(self, absorb):
            self.absorb = absorb

    @app.path(model=Root, path="root", absorb=True)
    def get_root(absorb):
        return Root(absorb)

    @app.view(model=Root)
    def default(self, request):
        return "%s" % self.absorb

    c = Client(app())

    response = c.get("/root/x/../child")
    assert response.body == b"child"

    response = c.get("/root/x/%2E%2E/child")
    assert response.body == b"child"
    response = c.get("/root/%2E%2E/%2E%2E/root")
    assert response.body == b""

    response = c.get("/root/%2E%2E/%2E%2E/test", expect_errors=True)
    assert response.status_code == 404

def test_quoting_link_generation():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="sim?ple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        return "View"

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/sim%3Fple")
    assert response.body == b"View"

    response = c.get("/sim%3Fple/link")
    assert response.body == b"http://localhost/sim%3Fple"


def test_quoting_link_generation_umlaut():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="simëple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        return "View"

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/sim%C3%ABple")
    assert response.body == b"View"

    response = c.get("/sim%C3%ABple/link")
    assert response.body == b"http://localhost/sim%C3%ABple"


def test_quoting_link_generation_tilde():
    # The tilde is an unreserved character according to
    # https://www.ietf.org/rfc/rfc3986.txt, but urllib.quote
    # quotes it anyway. We test whether our workaround using
    # the safe parameter works.
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="sim~ple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        return "View"

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/sim~ple")
    assert response.body == b"View"

    response = c.get("/sim~ple/link")
    assert response.body == b"http://localhost/sim~ple"


def test_parameter_quoting():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, s):
            self.s = s

    @App.path(model=Model, path="")
    def get_model(s):
        return Model(s)

    @App.view(model=Model)
    def default(self, request):
        return "View: %s" % self.s

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/?s=sim%C3%ABple")
    assert response.body == "View: simëple".encode()

    response = c.get("/link?s=sim%C3%ABple")
    assert response.body == b"http://localhost/?s=sim%C3%ABple"


def test_parameter_quoting_tilde():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, s):
            self.s = s

    @App.path(model=Model, path="")
    def get_model(s):
        return Model(s)

    @App.view(model=Model)
    def default(self, request):
        return "View: %s" % self.s

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.link(self)

    c = Client(App())

    response = c.get("/?s=sim~ple")
    assert response.body == b"View: sim~ple"

    response = c.get("/link?s=sim~ple")
    assert response.body == b"http://localhost/?s=sim~ple"
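

# Hedged sketch, not part of the original suite: the tilde workaround that the
# quoting tests above exercise matters mainly on Python 2, where urllib.quote
# percent-escaped "~". On Python 3.7+, urllib.parse.quote treats "~" as
# unreserved per RFC 3986 and leaves it intact even without a safe parameter,
# which is consistent with the unescaped "~" in the generated links above.
def test_quote_tilde_is_unreserved_sketch():
    from urllib.parse import quote

    # "~" survives quoting untouched on Python 3.7+
    assert quote("sim~ple") == "sim~ple"
    # "?" is reserved, so it is still escaped
    assert quote("sim?ple", safe="") == "sim%3Fple"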

def test_class_link_without_variables():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo")
    def get_model():
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model)

    c = Client(App())

    response = c.get("/foo")
    assert response.body == b"http://localhost/foo"


def test_class_link_no_app():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo")
    def get_model():
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model, app=None)

    c = Client(App())

    with pytest.raises(LinkError):
        c.get("/foo")


def test_class_link_with_variables():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo/{x}")
    def get_model(x):
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model, variables={"x": "X"})

    c = Client(App())

    response = c.get("/foo/3")
    assert response.body == b"http://localhost/foo/X"


def test_class_link_with_missing_variables():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo/{x}")
    def get_model(x):
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model, variables={})

    c = Client(App())

    with pytest.raises(KeyError):
        c.get("/foo/3")


def test_class_link_with_extra_variable():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo/{x}")
    def get_model(x):
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model, variables={"x": "X", "y": "Y"})

    c = Client(App())

    response = c.get("/foo/3")
    assert response.body == b"http://localhost/foo/X"


def test_class_link_with_url_parameter_variable():
    class App(morepath.App):
        pass

    class Model:
        pass

    @App.path(model=Model, path="/foo/{x}")
    def get_model(x, y):
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Model, variables={"x": "X", "y": "Y"})

    c = Client(App())

    response = c.get("/foo/3")
    assert response.body == b"http://localhost/foo/X?y=Y"


def test_class_link_with_subclass():
    class App(morepath.App):
        pass

    class Model:
        pass

    class Sub(Model):
        pass

    @App.path(model=Model, path="/foo/{x}")
    def get_model(x):
        return Model()

    @App.view(model=Model)
    def link(self, request):
        return request.class_link(Sub, variables={"x": "X"})

    c = Client(App())

    response = c.get("/foo/3")
    assert response.body == b"http://localhost/foo/X"
def test_absorb_class_path():
class App(morepath.App):
pass
class Root:
pass
class Model:
def __init__(self, absorb):
self.absorb = absorb
@App.path(model=Root, path="")
def get_root():
return Root()
@App.path(model=Model, path="foo", absorb=True)
def get_model(absorb):
return Model(absorb)
@App.view(model=Model)
def default(self, request):
return "%s" % self.absorb
@App.view(model=Root)
def default_root(self, request):
return request.class_link(Model, variables={"absorb": "a/b"})
c = Client(App())
# link to a/b absorb
response = c.get("/")
assert response.body == b"http://localhost/foo/a/b"
def test_absorb_class_path_with_variables():
class App(morepath.App):
pass
class Root:
pass
class Model:
def __init__(self, id, absorb):
self.id = id
self.absorb = absorb
@App.path(model=Root, path="")
def get_root():
return Root()
@App.path(model=Model, path="{id}", absorb=True)
def get_model(id, absorb):
return Model(id, absorb)
@App.view(model=Model)
def default(self, request):
return f"I:{self.id} A:{self.absorb}"
@App.view(model=Root)
def default_root(self, request):
return request.class_link(Model, variables=dict(id="foo", absorb="a/b"))
c = Client(App())
# link to a/b absorb
response = c.get("/")
assert response.body == b"http://localhost/foo/a/b"
def test_class_link_extra_parameters():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self, extra_parameters):
            self.extra_parameters = extra_parameters

    @App.path(model=Model, path="/")
    def get_model(extra_parameters):
        return Model(extra_parameters)

    @App.view(model=Model)
    def default(self, request):
        return repr(sorted(self.extra_parameters.items()))

    @App.view(model=Model, name="link")
    def link(self, request):
        return request.class_link(
            Model, variables={"extra_parameters": {"a": "A", "b": "B"}}
        )

    c = Client(App())

    response = c.get("/link?a=A&b=B")
    assert sorted(response.body[len("http://localhost/?") :].split(b"&")) == [
        b"a=A",
        b"b=B",
    ]


def test_path_on_model_class():
    class App(morepath.App):
        pass

    @App.path("/")
    class Model:
        def __init__(self):
            pass

    @App.path("/login")
    class Login:
        pass

    @App.view(model=Model)
    def model_view(self, request):
        return "Model"

    @App.view(model=Login)
    def login_view(self, request):
        return "Login"

    c = Client(App())

    response = c.get("/")
    assert response.body == b"Model"

    response = c.get("/login")
    assert response.body == b"Login"


def test_path_without_model():
    class App(morepath.App):
        pass

    @App.path("/")
    def get_path():
        pass

    with pytest.raises(dectate.DirectiveReportError):
        App.commit()


def test_two_path_on_same_model_should_conflict():
    class App(morepath.App):
        pass

    @App.path("/login")
    @App.path("/")
    class Login:
        pass

    with pytest.raises(dectate.ConflictError):
        App.commit()


def test_path_on_same_model_explicit_and_class_should_conflict():
    class App(morepath.App):
        pass

    @App.path("/")
    class Login:
        pass

    @App.path("/login", model=Login)
    def get_path():
        return Login()

    with pytest.raises(dectate.ConflictError):
        App.commit()


def test_nonexisting_path_too_long_unconsumed():
    class App(morepath.App):
        pass

    class Model:
        def __init__(self):
            pass

    @App.path(model=Model, path="simple")
    def get_model():
        return Model()

    @App.view(model=Model)
    def default(self, request):
        return "View"

    c = Client(App())

    c.get("/foo/bar/baz", status=404)
def test_collection_and_item():
    class App(morepath.App):
        pass

    class Collection:
        def __init__(self):
            self.items = {}

    class Item:
        def __init__(self, id):
            self.id = id

    collection = Collection()
    collection.items["a"] = Item("a")
    collection.items["b"] = Item("b")

    @App.path(model=Collection, path="/")
    def get_collection():
        return collection

    @App.path(model=Item, path="/{id}")
    def get_item(id):
        return collection.items.get(id)

    @App.view(model=Collection)
    def default_collection(self, request):
        return "Collection"

    @App.view(model=Item)
    def default(self, request):
        return "View: %s" % self.id

    c = Client(App())

    # compare bytes to bytes, otherwise the assertion is trivially true
    r = c.get("/c", status=404)
    assert r.body != b"Collection"

    r = c.get("/a")
    assert r.body == b"View: a"


def test_view_for_missing():
    class App(morepath.App):
        pass

    class Item:
        def __init__(self, id):
            self.id = id

    @App.path(model=Item, path="/{id}")
    def get_item(id):
        if id == "found":
            return Item(id)
        return None

    @App.view(model=Item, name="edit")
    def default(self, request):
        return "View: %s" % self.id

    c = Client(App())

    c.get("/notfound/+edit", status=404)
    c.get("/notfound/edit", status=404)


def test_absorb_error():
    class App(morepath.App):
        pass

    @App.path("/")
    class Root:
        pass

    @App.view(model=Root)
    def view_root(self, request):
        return "root"

    class File:
        def __init__(self, absorb):
            self.absorb = absorb

    @App.path("/files", model=File, absorb=True)
    def get_file(absorb):
        if absorb == "foo":
            return File("foo")
        return None

    @App.view(model=File)
    def view_file(self, request):
        return request.path

    App.commit()

    client = Client(App())
    assert client.get("/").text == "root"
    assert client.get("/files/foo").text == "/files/foo"
    client.get("/files/bar", status=404)


def test_named_view_on_root():
    class App(morepath.App):
        pass

    @App.path(path="/")
    class Root:
        pass

    @App.view(model=Root, name="named")
    def named(self, request):
        return "Named view on root"

    @App.view(model=Root)
    def default(self, request):
        return "Default view on root"

    c = Client(App())

    response = c.get("/named")
    assert response.body == b"Named view on root"

    response = c.get("/+named")
    assert response.body == b"Named view on root"

    response = c.get("/")
    assert response.body == b"Default view on root"
| 22.280756 | 80 | 0.585475 | 6,766 | 50,711 | 4.276973 | 0.03828 | 0.055982 | 0.056811 | 0.066971 | 0.850266 | 0.822102 | 0.804168 | 0.774829 | 0.741067 | 0.696662 | 0 | 0.011316 | 0.259372 | 50,711 | 2,275 | 81 | 22.290549 | 0.759172 | 0.007257 | 0 | 0.745891 | 0 | 0 | 0.104844 | 0.001887 | 0 | 0 | 0 | 0 | 0.082174 | 1 | 0.219343 | false | 0.072693 | 0.007585 | 0.128951 | 0.465234 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 10 |
5ed94d394ca28614c2131c6c94838c1d9d1edccf | 79 | py | Python | py-bindings/ompl/morse/__init__.py | ericpairet/ompl | 25c76431cef25f0100ed74d09dd88944ecca5ee1 | [
"BSD-3-Clause"
] | 837 | 2015-01-07T12:01:20.000Z | 2022-03-31T08:42:42.000Z | py-bindings/ompl/morse/__init__.py | ericpairet/ompl | 25c76431cef25f0100ed74d09dd88944ecca5ee1 | [
"BSD-3-Clause"
] | 271 | 2015-01-12T22:05:06.000Z | 2022-03-30T22:16:01.000Z | py-bindings/ompl/morse/__init__.py | ericpairet/ompl | 25c76431cef25f0100ed74d09dd88944ecca5ee1 | [
"BSD-3-Clause"
] | 452 | 2015-02-10T08:48:21.000Z | 2022-03-23T06:53:33.000Z | from ompl import control
from ompl import util
from ompl.morse._morse import *
| 19.75 | 31 | 0.810127 | 13 | 79 | 4.846154 | 0.461538 | 0.380952 | 0.444444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151899 | 79 | 3 | 32 | 26.333333 | 0.940299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5efc9596aa75a5600fc62a1e2dd1eed0cd965fb7 | 2,258 | py | Python | pandora/evaluation.py | mikekestemont/pandora | ecae769c8dac5cce563da114be923d22eec6656d | [
"MIT"
] | 2 | 2016-02-19T10:23:17.000Z | 2016-09-28T16:14:41.000Z | pandora/evaluation.py | mikekestemont/pandora | ecae769c8dac5cce563da114be923d22eec6656d | [
"MIT"
] | 6 | 2016-06-22T12:40:57.000Z | 2018-04-16T08:39:52.000Z | pandora/evaluation.py | mikekestemont/pandora | ecae769c8dac5cce563da114be923d22eec6656d | [
"MIT"
] | 3 | 2016-01-10T10:24:53.000Z | 2017-02-06T13:47:20.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import print_function
from collections import Counter
from operator import itemgetter
import numpy as np
def single_label_accuracies(gold, silver, test_tokens, known_tokens,
                            print_scores=True):
    """
    Calculate accuracies for all, known and unknown tokens.
    Uses index of items seen during training.
    """
    kno_corr, unk_corr = 0.0, 0.0
    nb_kno, nb_unk = 0.0, 0.0

    for gold_pred, silver_pred, tok in zip(gold, silver, test_tokens):
        if tok in known_tokens:
            nb_kno += 1
            if gold_pred == silver_pred:
                kno_corr += 1
        else:
            nb_unk += 1
            if gold_pred == silver_pred:
                unk_corr += 1

    all_acc = (kno_corr + unk_corr) / (nb_kno + nb_unk)
    kno_acc = kno_corr / nb_kno

    # account for situation with no unknowns:
    unk_acc = 0.0
    if nb_unk > 0:
        unk_acc = unk_corr / nb_unk

    if print_scores:
        print('+\tall acc:', all_acc)
        print('+\tkno acc:', kno_acc)
        print('+\tunk acc:', unk_acc)

    return all_acc, kno_acc, unk_acc


def multilabel_accuracies(gold, silver, test_tokens, known_tokens,
                          print_scores=True):
    """
    Calculate accuracies for all, known and unknown tokens.
    Uses index of items seen during training.
    """
    kno_corr, unk_corr = 0.0, 0.0
    nb_kno, nb_unk = 0.0, 0.0

    for gold_pred, silver_pred, tok in zip(gold, silver, test_tokens):
        gold_pred = set(gold_pred.split('|'))
        silver_pred = set(silver_pred.split('|'))
        if tok in known_tokens:
            nb_kno += 1
            if gold_pred == silver_pred:
                kno_corr += 1
        else:
            nb_unk += 1
            if gold_pred == silver_pred:
                unk_corr += 1

    all_acc = (kno_corr + unk_corr) / (nb_kno + nb_unk)
    kno_acc = kno_corr / nb_kno

    # account for situation with no unknowns:
    unk_acc = 0.0
    if nb_unk > 0:
        unk_acc = unk_corr / nb_unk

    if print_scores:
        print('+\tall acc:', all_acc)
        print('+\tkno acc:', kno_acc)
        print('+\tunk acc:', unk_acc)

    return all_acc, kno_acc, unk_acc
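A quick sanity check of the accuracy split above, on a small hypothetical tagging example (the condensed helper below mirrors `single_label_accuracies` so the snippet is self-contained; the tokens and labels are made up):

```python
def single_label_accuracies(gold, silver, test_tokens, known_tokens):
    # Same counting logic as above, minus the printing.
    kno_corr = unk_corr = nb_kno = nb_unk = 0.0
    for gold_pred, silver_pred, tok in zip(gold, silver, test_tokens):
        if tok in known_tokens:
            nb_kno += 1
            kno_corr += gold_pred == silver_pred
        else:
            nb_unk += 1
            unk_corr += gold_pred == silver_pred
    all_acc = (kno_corr + unk_corr) / (nb_kno + nb_unk)
    kno_acc = kno_corr / nb_kno
    unk_acc = unk_corr / nb_unk if nb_unk else 0.0
    return all_acc, kno_acc, unk_acc


gold = ["NOUN", "VERB", "NOUN", "ADJ"]      # reference labels
silver = ["NOUN", "VERB", "ADJ", "ADJ"]     # predicted labels
tokens = ["cat", "runs", "dog", "fast"]
known = {"cat", "runs"}                     # tokens seen during training

all_acc, kno_acc, unk_acc = single_label_accuracies(gold, silver, tokens, known)
print(all_acc, kno_acc, unk_acc)  # 0.75 1.0 0.5
```

Both known tokens are tagged correctly (known accuracy 1.0), one of the two unknown tokens is wrong (unknown accuracy 0.5), giving 3/4 overall.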
| 27.536585 | 70 | 0.584588 | 325 | 2,258 | 3.784615 | 0.206154 | 0.022764 | 0.019512 | 0.087805 | 0.834146 | 0.834146 | 0.834146 | 0.834146 | 0.834146 | 0.834146 | 0 | 0.020104 | 0.317095 | 2,258 | 82 | 71 | 27.536585 | 0.777562 | 0.140833 | 0 | 0.846154 | 0 | 0 | 0.035808 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.076923 | 0 | 0.153846 | 0.211538 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5eff49ab774b5be893f7416e32ca525bbae6a16e | 487 | py | Python | PythonAPI/carissma_project/lib/python3.5/site-packages/pandas/tests/util/conftest.py | AbdulHoffmann/carla_carissma | 8d382769ffa02a6c61a22c57160285505f5ff0a4 | [
"MIT"
] | 6,989 | 2017-07-18T06:23:18.000Z | 2022-03-31T15:58:36.000Z | pandas/tests/util/conftest.py | ivan-vasilev/pandas | 4071dde86e33434e1bee8304fa62074949f813cc | [
"BSD-3-Clause"
] | 1,978 | 2017-07-18T09:17:58.000Z | 2022-03-31T14:28:43.000Z | pandas/tests/util/conftest.py | ivan-vasilev/pandas | 4071dde86e33434e1bee8304fa62074949f813cc | [
"BSD-3-Clause"
] | 1,228 | 2017-07-18T09:03:13.000Z | 2022-03-29T05:57:40.000Z | import pytest
@pytest.fixture(params=[True, False])
def check_dtype(request):
    return request.param


@pytest.fixture(params=[True, False])
def check_exact(request):
    return request.param


@pytest.fixture(params=[True, False])
def check_index_type(request):
    return request.param


@pytest.fixture(params=[True, False])
def check_less_precise(request):
    return request.param


@pytest.fixture(params=[True, False])
def check_categorical(request):
    return request.param
| 18.037037 | 37 | 0.749487 | 64 | 487 | 5.59375 | 0.28125 | 0.181564 | 0.265363 | 0.321229 | 0.782123 | 0.782123 | 0.782123 | 0.681564 | 0.681564 | 0.681564 | 0 | 0 | 0.12731 | 487 | 26 | 38 | 18.730769 | 0.842353 | 0 | 0 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3125 | false | 0 | 0.0625 | 0.3125 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 7 |
6f00cb70eb55f632b4ba112286e58fc480249411 | 48 | py | Python | files/python_by_examples/loops/comprehensions/p01_list_comprehensions.py | Sher-Chowdhury/CentOS7-Python | 2282aa2b8396891a060a132a3b340cc810bbf746 | [
"Apache-2.0"
] | null | null | null | files/python_by_examples/loops/comprehensions/p01_list_comprehensions.py | Sher-Chowdhury/CentOS7-Python | 2282aa2b8396891a060a132a3b340cc810bbf746 | [
"Apache-2.0"
] | null | null | null | files/python_by_examples/loops/comprehensions/p01_list_comprehensions.py | Sher-Chowdhury/CentOS7-Python | 2282aa2b8396891a060a132a3b340cc810bbf746 | [
"Apache-2.0"
] | null | null | null | a = [1, 3, 5, 7, 9, 11]
[print(x) for x in a]
| 9.6 | 23 | 0.4375 | 13 | 48 | 1.615385 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.212121 | 0.3125 | 48 | 4 | 24 | 12 | 0.424242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
48ee6d07e2c2e071bff4f33f98040c4fcb7f2fd8 | 28,852 | py | Python | scripts/slave/recipe_modules/skia/fake_specs.py | bopopescu/build | 4e95fd33456e552bfaf7d94f7d04b19273d1c534 | [
"BSD-3-Clause"
] | null | null | null | scripts/slave/recipe_modules/skia/fake_specs.py | bopopescu/build | 4e95fd33456e552bfaf7d94f7d04b19273d1c534 | [
"BSD-3-Clause"
] | null | null | null | scripts/slave/recipe_modules/skia/fake_specs.py | bopopescu/build | 4e95fd33456e552bfaf7d94f7d04b19273d1c534 | [
"BSD-3-Clause"
] | 1 | 2020-07-23T10:57:32.000Z | 2020-07-23T10:57:32.000Z | # This file is generated by the scripts/slave/skia/gen_buildbot_specs.py script.
FAKE_SPECS = {
'Build-Mac-Clang-Arm7-Release-iOS': {
'build_targets': [
'most',
],
'builder_cfg': {
'compiler': 'Clang',
'configuration': 'Release',
'extra_config': 'iOS',
'is_trybot': False,
'os': 'Mac',
'role': 'Build',
'target_arch': 'Arm7',
},
'configuration': 'Release',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': False,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
('skia_arch_type=arm skia_clang_build=1 skia_os=ios skia_warnings_a'
's_errors=1'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Build-Mac-Clang-x86_64-Release-Swarming': {
'build_targets': [
'most',
],
'builder_cfg': {
'compiler': 'Clang',
'configuration': 'Release',
'extra_config': 'Swarming',
'is_trybot': False,
'os': 'Mac',
'role': 'Build',
'target_arch': 'x86_64',
},
'configuration': 'Release',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': False,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
'skia_arch_type=x86_64 skia_clang_build=1 skia_warnings_as_errors=1',
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Build-Mac10.8-Clang-Arm7-Debug-Android': {
'build_targets': [
'most',
],
'builder_cfg': {
'compiler': 'Clang',
'configuration': 'Debug',
'extra_config': 'Android',
'is_trybot': False,
'os': 'Mac10.8',
'role': 'Build',
'target_arch': 'Arm7',
},
'configuration': 'Debug',
'device_cfg': 'arm_v7_neon',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': False,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
'skia_arch_type=arm skia_clang_build=1 skia_warnings_as_errors=0',
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Build-Mac10.9-Clang-Arm7-Debug-iOS': {
'build_targets': [
'most',
],
'builder_cfg': {
'compiler': 'Clang',
'configuration': 'Debug',
'extra_config': 'iOS',
'is_trybot': False,
'os': 'Mac10.9',
'role': 'Build',
'target_arch': 'Arm7',
},
'configuration': 'Debug',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': False,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
('skia_arch_type=arm skia_clang_build=1 skia_os=ios skia_warnings_a'
's_errors=1'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
  'Build-Ubuntu-GCC-Arm7-Debug-Android': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Debug',
      'extra_config': 'Android',
      'is_trybot': False,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'Arm7',
    },
    'configuration': 'Debug',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-Arm7-Debug-Android-Trybot': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Debug',
      'extra_config': 'Android',
      'is_trybot': True,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'Arm7',
    },
    'configuration': 'Debug',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-Arm7-Release-Android': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Release',
      'extra_config': 'Android',
      'is_trybot': False,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'Arm7',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-x86_64-Debug-Swarming': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Debug',
      'extra_config': 'Swarming',
      'is_trybot': False,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'x86_64',
    },
    'configuration': 'Debug',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=x86_64 skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-x86_64-Release-CMake': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Release',
      'extra_config': 'CMake',
      'is_trybot': False,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'x86_64',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=x86_64 skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-x86_64-Release-Swarming-Trybot': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Release',
      'extra_config': 'Swarming',
      'is_trybot': True,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'x86_64',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=x86_64 skia_warnings_as_errors=1',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Ubuntu-GCC-x86_64-Release-SwarmingValgrind': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'GCC',
      'configuration': 'Release',
      'extra_config': 'SwarmingValgrind',
      'is_trybot': False,
      'os': 'Ubuntu',
      'role': 'Build',
      'target_arch': 'x86_64',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          ('skia_arch_type=x86_64 skia_release_optimization_level=1 skia_warn'
           'ings_as_errors=1'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': False,
    'upload_perf_results': False,
  },
  'Build-Win-MSVC-x86-Debug-VS2015': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'MSVC',
      'configuration': 'Debug',
      'extra_config': 'VS2015',
      'is_trybot': False,
      'os': 'Win',
      'role': 'Build',
      'target_arch': 'x86',
    },
    'configuration': 'Debug',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          ('qt_sdk=C:/Qt/4.8.5/ skia_arch_type=x86 skia_warnings_as_errors=1 '
           'skia_win_debuggers_path=c:/DbgHelp skia_win_ltcg=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Build-Win8-MSVC-x86_64-Release-Swarming': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'compiler': 'MSVC',
      'configuration': 'Release',
      'extra_config': 'Swarming',
      'is_trybot': False,
      'os': 'Win8',
      'role': 'Build',
      'target_arch': 'x86_64',
    },
    'configuration': 'Release_x64',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          ('qt_sdk=C:/Qt/Qt5.1.0/5.1.0/msvc2012_64/ skia_arch_type=x86_64 ski'
           'a_warnings_as_errors=1 skia_win_debuggers_path=c:/DbgHelp skia_wi'
           'n_ltcg=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Housekeeper-PerCommit': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'frequency': 'PerCommit',
      'is_trybot': False,
      'role': 'Housekeeper',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_shared_lib=1 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Housekeeper-PerCommit-Trybot': {
    'build_targets': [
      'most',
    ],
    'builder_cfg': {
      'frequency': 'PerCommit',
      'is_trybot': True,
      'role': 'Housekeeper',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_shared_lib=1 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Perf-Android-GCC-Nexus5-CPU-NEON-Arm7-Release-Appurify': {
    'build_targets': [
      'VisualBenchTest_APK',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'NEON',
      'extra_config': 'Appurify',
      'is_trybot': False,
      'model': 'Nexus5',
      'os': 'Android',
      'role': 'Perf',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'hammerhead',
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Perf-Android-GCC-Nexus5-GPU-Adreno330-Arm7-Release-Appurify': {
    'build_targets': [
      'VisualBenchTest_APK',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'Adreno330',
      'extra_config': 'Appurify',
      'is_trybot': False,
      'model': 'Nexus5',
      'os': 'Android',
      'role': 'Perf',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=arm skia_dump_stats=1 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'hammerhead',
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Perf-Android-GCC-Nexus7-GPU-Tegra3-Arm7-Release': {
    'build_targets': [
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'Tegra3',
      'is_trybot': False,
      'model': 'Nexus7',
      'os': 'Android',
      'role': 'Perf',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=arm skia_dump_stats=1 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'grouper',
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Perf-Ubuntu-GCC-GCE-CPU-AVX2-x86_64-Release-Swarming-Trybot': {
    'build_targets': [
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX2',
      'extra_config': 'Swarming',
      'is_trybot': True,
      'model': 'GCE',
      'os': 'Ubuntu',
      'role': 'Perf',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=x86_64 skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Perf-Ubuntu-GCC-ShuttleA-GPU-GTX550Ti-x86_64-Release-VisualBench': {
    'build_targets': [
      'visualbench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'GTX550Ti',
      'extra_config': 'VisualBench',
      'is_trybot': False,
      'model': 'ShuttleA',
      'os': 'Ubuntu',
      'role': 'Perf',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          ('skia_arch_type=x86_64 skia_dump_stats=1 skia_use_sdl=1 skia_warni'
           'ngs_as_errors=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Perf-Win8-MSVC-ShuttleB-GPU-HD4600-x86_64-Release-Trybot': {
    'build_targets': [
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'MSVC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'HD4600',
      'is_trybot': True,
      'model': 'ShuttleB',
      'os': 'Win8',
      'role': 'Perf',
    },
    'configuration': 'Release_x64',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': False,
    'env': {
      'GYP_DEFINES':
          ('qt_sdk=C:/Qt/Qt5.1.0/5.1.0/msvc2012_64/ skia_arch_type=x86_64 ski'
           'a_dump_stats=1 skia_warnings_as_errors=0 skia_win_debuggers_path='
           'c:/DbgHelp'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': True,
  },
  'Test-Android-GCC-Nexus6-GPU-Adreno420-Arm7-Release': {
    'build_targets': [
      'dm',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'Adreno420',
      'is_trybot': False,
      'model': 'Nexus6',
      'os': 'Android',
      'role': 'Test',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'shamu',
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Android-GCC-Nexus7-GPU-Tegra3-Arm7-Debug': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Debug',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'Tegra3',
      'is_trybot': False,
      'model': 'Nexus7',
      'os': 'Android',
      'role': 'Test',
    },
    'configuration': 'Debug',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'grouper',
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Android-GCC-Nexus7v2-GPU-Tegra3-Arm7-Release-Swarming': {
    'build_targets': [
      'dm',
    ],
    'builder_cfg': {
      'arch': 'Arm7',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'Tegra3',
      'extra_config': 'Swarming',
      'is_trybot': False,
      'model': 'Nexus7v2',
      'os': 'Android',
      'role': 'Test',
    },
    'configuration': 'Release',
    'device_cfg': 'arm_v7_neon',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=arm skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'product.board': 'flo',
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-ChromeOS-GCC-Link-CPU-AVX-x86_64-Debug': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Debug',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX',
      'is_trybot': False,
      'model': 'Link',
      'os': 'ChromeOS',
      'role': 'Test',
    },
    'configuration': 'Debug',
    'device_cfg': 'link',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=x86_64 skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Mac-Clang-MacMini6.2-CPU-AVX-x86_64-Release-Swarming': {
    'build_targets': [
      'dm',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'Clang',
      'configuration': 'Release',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX',
      'extra_config': 'Swarming',
      'is_trybot': False,
      'model': 'MacMini6.2',
      'os': 'Mac',
      'role': 'Test',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': True,
    'env': {
      'CC': '/usr/bin/clang',
      'CXX': '/usr/bin/clang++',
      'GYP_DEFINES':
          ('skia_arch_type=x86_64 skia_clang_build=1 skia_gpu=0 skia_warnings'
           '_as_errors=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-Clang-GCE-CPU-AVX2-x86_64-Coverage-Trybot': {
    'build_targets': [
      'dm',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'Clang',
      'configuration': 'Coverage',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX2',
      'is_trybot': True,
      'model': 'GCE',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Coverage',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': True,
    'env': {
      'CC': '/usr/bin/clang-3.6',
      'CXX': '/usr/bin/clang++-3.6',
      'GYP_DEFINES':
          ('skia_arch_type=x86_64 skia_clang_build=1 skia_gpu=0 skia_warnings'
           '_as_errors=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': False,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-GCE-CPU-AVX2-x86_64-Debug': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Debug',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX2',
      'is_trybot': False,
      'model': 'GCE',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Debug',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=x86_64 skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-GCE-CPU-AVX2-x86_64-Debug-Swarming': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Debug',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX2',
      'extra_config': 'Swarming',
      'is_trybot': False,
      'model': 'GCE',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Debug',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=x86_64 skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-GCE-CPU-AVX2-x86_64-Release-TSAN': {
    'build_targets': [
      'dm',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'CPU',
      'cpu_or_gpu_value': 'AVX2',
      'extra_config': 'TSAN',
      'is_trybot': False,
      'model': 'GCE',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': False,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES':
          'skia_arch_type=x86_64 skia_gpu=0 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': False,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-ShuttleA-GPU-GTX550Ti-x86_64-Debug-ZeroGPUCache': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Debug',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'GTX550Ti',
      'extra_config': 'ZeroGPUCache',
      'is_trybot': False,
      'model': 'ShuttleA',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Debug',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES': 'skia_arch_type=x86_64 skia_warnings_as_errors=0',
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': True,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-ShuttleA-GPU-GTX550Ti-x86_64-Release-SwarmingValgrind': {
    'build_targets': [
      'dm',
      'nanobench',
    ],
    'builder_cfg': {
      'arch': 'x86_64',
      'compiler': 'GCC',
      'configuration': 'Release',
      'cpu_or_gpu': 'GPU',
      'cpu_or_gpu_value': 'GTX550Ti',
      'extra_config': 'SwarmingValgrind',
      'is_trybot': False,
      'model': 'ShuttleA',
      'os': 'Ubuntu',
      'role': 'Test',
    },
    'configuration': 'Release',
    'dm_flags': [
      '--dummy-flags',
    ],
    'do_perf_steps': True,
    'do_test_steps': True,
    'env': {
      'GYP_DEFINES':
          ('skia_arch_type=x86_64 skia_release_optimization_level=1 skia_warn'
           'ings_as_errors=0'),
    },
    'nanobench_flags': [
      '--dummy-flags',
    ],
    'upload_dm_results': False,
    'upload_perf_results': False,
  },
  'Test-Ubuntu-GCC-ShuttleA-GPU-GTX550Ti-x86_64-Release-Valgrind': {
'build_targets': [
'dm',
'nanobench',
],
'builder_cfg': {
'arch': 'x86_64',
'compiler': 'GCC',
'configuration': 'Release',
'cpu_or_gpu': 'GPU',
'cpu_or_gpu_value': 'GTX550Ti',
'extra_config': 'Valgrind',
'is_trybot': False,
'model': 'ShuttleA',
'os': 'Ubuntu',
'role': 'Test',
},
'configuration': 'Release',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': True,
'do_test_steps': True,
'env': {
'GYP_DEFINES':
('skia_arch_type=x86_64 skia_release_optimization_level=1 skia_warn'
'ings_as_errors=0'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': False,
'upload_perf_results': False,
},
'Test-Win8-MSVC-ShuttleB-CPU-AVX2-x86_64-Release-Swarming': {
'build_targets': [
'dm',
],
'builder_cfg': {
'arch': 'x86_64',
'compiler': 'MSVC',
'configuration': 'Release',
'cpu_or_gpu': 'CPU',
'cpu_or_gpu_value': 'AVX2',
'extra_config': 'Swarming',
'is_trybot': False,
'model': 'ShuttleB',
'os': 'Win8',
'role': 'Test',
},
'configuration': 'Release_x64',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': True,
'env': {
'GYP_DEFINES':
('qt_sdk=C:/Qt/Qt5.1.0/5.1.0/msvc2012_64/ skia_arch_type=x86_64 ski'
'a_gpu=0 skia_warnings_as_errors=0 skia_win_debuggers_path=c:/DbgH'
'elp'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Test-Win8-MSVC-ShuttleB-CPU-AVX2-x86_64-Release-Trybot': {
'build_targets': [
'dm',
],
'builder_cfg': {
'arch': 'x86_64',
'compiler': 'MSVC',
'configuration': 'Release',
'cpu_or_gpu': 'CPU',
'cpu_or_gpu_value': 'AVX2',
'is_trybot': True,
'model': 'ShuttleB',
'os': 'Win8',
'role': 'Test',
},
'configuration': 'Release_x64',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': True,
'env': {
'GYP_DEFINES':
('qt_sdk=C:/Qt/Qt5.1.0/5.1.0/msvc2012_64/ skia_arch_type=x86_64 ski'
'a_gpu=0 skia_warnings_as_errors=0 skia_win_debuggers_path=c:/DbgH'
'elp'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Test-iOS-Clang-iPad4-GPU-SGX554-Arm7-Debug': {
'build_targets': [
'iOSShell',
],
'builder_cfg': {
'arch': 'Arm7',
'compiler': 'Clang',
'configuration': 'Debug',
'cpu_or_gpu': 'GPU',
'cpu_or_gpu_value': 'SGX554',
'is_trybot': False,
'model': 'iPad4',
'os': 'iOS',
'role': 'Test',
},
'configuration': 'Debug',
'device_cfg': 'iPad4,1',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': True,
'do_test_steps': True,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
('skia_arch_type=arm skia_clang_build=1 skia_os=ios skia_warnings_a'
's_errors=0'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
'Test-iOS-Clang-iPad4-GPU-SGX554-Arm7-Release-Swarming': {
'build_targets': [
'iOSShell',
],
'builder_cfg': {
'arch': 'Arm7',
'compiler': 'Clang',
'configuration': 'Release',
'cpu_or_gpu': 'GPU',
'cpu_or_gpu_value': 'SGX554',
'extra_config': 'Swarming',
'is_trybot': False,
'model': 'iPad4',
'os': 'iOS',
'role': 'Test',
},
'configuration': 'Release',
'device_cfg': 'iPad4,1',
'dm_flags': [
'--dummy-flags',
],
'do_perf_steps': False,
'do_test_steps': True,
'env': {
'CC': '/usr/bin/clang',
'CXX': '/usr/bin/clang++',
'GYP_DEFINES':
('skia_arch_type=arm skia_clang_build=1 skia_os=ios skia_warnings_a'
's_errors=0'),
},
'nanobench_flags': [
'--dummy-flags',
],
'upload_dm_results': True,
'upload_perf_results': False,
},
}
# eod/plugins/yolov5/utils/__init__.py (Helicopt/EOD, Apache-2.0)
from .optimizer_helper import *  # noqa
from .lr_helper import *  # noqa
# pymotor/plots.py (yoonghm/pymotor, MIT)
import matplotlib.pyplot as plt
DEFAULT_PLOT_WIDTH_INCHES = 6.5
DEFAULT_PLOT_HEIGHT_INCHES = 9.0
def _plot_df(df,
plot_title='pandas.DataFrame',
filename=None,
width=DEFAULT_PLOT_WIDTH_INCHES,
height=DEFAULT_PLOT_HEIGHT_INCHES,
):
if filename is None:
plt.switch_backend('TKAgg')
else:
plt.switch_backend('Agg')
labels = list(df.columns.values)
num_plots = df.shape[1] - 1
plt.figure(figsize=(width, height), clear=True)
for i in range(num_plots):
plt.subplot(num_plots, 1, i + 1)
        plt.plot(df.iloc[:, 0].to_numpy(), df.iloc[:, i + 1].to_numpy(),
                 linestyle='solid', linewidth=1, color=(0.0, 0.0, 0.0))
plt.grid(linestyle=':', linewidth=1, color=(0.75, 0.75, 0.75))
plt.ylabel(labels[i + 1])
if i == 0:
plt.title(plot_title)
if i == num_plots - 1:
plt.xlabel(labels[0])
plt.tight_layout()
if filename is None:
plt.show()
else:
plt.savefig(filename)
def _plot_df_dual(df, series,
plot_title='pandas.DataFrame',
filename=None,
width=DEFAULT_PLOT_WIDTH_INCHES,
height=DEFAULT_PLOT_HEIGHT_INCHES,
):
if filename is None:
plt.switch_backend('TKAgg')
else:
plt.switch_backend('Agg')
labels = list(df.columns.values)
num_plots = df.shape[1] - 1
plt.figure(figsize=(width, height), clear=True)
for i in range(num_plots):
plt.subplot(num_plots, 1, i + 1)
        plt.plot(df.iloc[:, 0].to_numpy(), df.iloc[:, i + 1].to_numpy(),
                 linestyle='solid', linewidth=1, color=(0.0, 0.0, 0.0))
        if i == 0:
            plt.plot(df.iloc[:, 0].to_numpy(), series.to_numpy(),
                     linestyle='--', linewidth=1, color=(1.0, 0.0, 0.0))
plt.grid(linestyle=':', linewidth=1, color=(0.75, 0.75, 0.75))
plt.ylabel(labels[i + 1])
if i == 0:
plt.title(plot_title)
if i == num_plots - 1:
plt.xlabel(labels[0])
plt.tight_layout()
if filename is None:
plt.show()
else:
plt.savefig(filename)
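A minimal self-contained sketch of the subplot layout `_plot_df` produces: column 0 of the frame is the shared x-axis and every remaining column gets its own stacked subplot. The sample DataFrame, its column names, and the output filename `profile_sketch.png` are invented for illustration; pandas and matplotlib are assumed to be installed.

```python
import os
import tempfile

import matplotlib
matplotlib.use('Agg')  # headless backend, mirrors the filename branch of _plot_df
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Column 0 is the shared x-axis; every remaining column gets its own subplot.
t = np.linspace(0.0, 1.0, 50)
df = pd.DataFrame({'t': t, 'position': t ** 2, 'velocity': 2 * t})

num_plots = df.shape[1] - 1
plt.figure(figsize=(6.5, 9.0), clear=True)
for i in range(num_plots):
    plt.subplot(num_plots, 1, i + 1)
    plt.plot(df.iloc[:, 0].to_numpy(), df.iloc[:, i + 1].to_numpy(),
             linestyle='solid', linewidth=1, color=(0.0, 0.0, 0.0))
    plt.ylabel(df.columns[i + 1])
plt.xlabel(df.columns[0])
plt.tight_layout()

out_path = os.path.join(tempfile.gettempdir(), 'profile_sketch.png')
plt.savefig(out_path)
```

Passing `filename=None` in the real functions switches to an interactive backend and calls `plt.show()` instead of saving.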
# experiment/random/tanhTest.py (predoodl/predoo, MIT)
import numpy as np
import torch
import torch.nn.functional as F
import MNN
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
import csv
import time
import math
F_mnn = MNN.expr
np.random.seed(0)
def input_withDiffDype(x, dtype):
return tf.convert_to_tensor(x, dtype=dtype)
def tf_TanhWithDiffDype(dtype):
return tf.keras.layers.Activation(
'tanh', dtype=dtype
)
def torch_input_withDiffDype(x, dtype):
    return torch.tensor(x, dtype=dtype)
def torch_tanhWithDiffType(dtype):
torch_tanh = torch.nn.Tanh()
# torch_tanh.half()
torch_tanh.type(dtype)
return torch_tanh
def getdataforTorch(f):
out = open(file=f, mode="a", newline='')
csv_writer = csv.writer(out)
for j in range(1000):
print('j= ', j)
print('------------------------------------------')
x = np.random.randn(2, 2)
csv_writer.writerow([x])
csv_writer.writerow(["No.", "32_16", "32_64", "64_16", "64_32"])
for i in range(100):
print(i)
res = []
res.append(i)
# x_16 = torch_input_withDiffDype(x, torch.float16)
x_32 = torch_input_withDiffDype(x, torch.float32)
x_64 = torch_input_withDiffDype(x, torch.float64)
torch_tanh_16 = torch_tanhWithDiffType(torch.float16)
torch_tanh_32 = torch_tanhWithDiffType(torch.float32)
torch_tanh_64 = torch_tanhWithDiffType(torch.float64)
out_32_32_1 = torch_tanh_32(x_32).detach().numpy()
out_32_16 = torch_tanh_16(x_32).detach().numpy()
out_32_64 = torch_tanh_64(x_32).detach().numpy()
            diff3 = np.mean(out_32_16 - out_32_32_1)  # high precision cast to low precision
            diff4 = np.mean(out_32_64 - out_32_32_1)  # low precision cast to high precision
res.append(diff3)
res.append(diff4)
out_64_16 = torch_tanh_16(x_64).detach().numpy()
out_64_32 = torch_tanh_32(x_64).detach().numpy()
out_64_64 = torch_tanh_64(x_64).detach().numpy()
            diff5 = np.mean(out_64_16 - out_64_64)  # high precision cast to low precision
            diff6 = np.mean(out_64_32 - out_64_64)  # low precision cast to high precision
res.append(diff5)
res.append(diff6)
csv_writer.writerow(res)
out.close()
def torch_disturb(f):
out = open(file=f, mode="a", newline='')
csv_writer = csv.writer(out)
for j in range(1000):
print('j= ', j)
print('------------------------------------------')
x = np.random.randn(2, 2)
a1 = 0.000001 * np.ones((2, 2), np.float64)
a2 = 0.00000001 * np.ones((2, 2), np.float64)
a3 = 0.0000000001 * np.ones((2, 2), np.float64)
csv_writer.writerow([x])
csv_writer.writerow(["No.", "32_16", "32_64", "64_16", "64_32"])
getTorchData(x, csv_writer)
csv_writer.writerow(["+10^-6", "32_16", "32_64", "64_16", "64_32"])
getTorchData(x + a1, csv_writer)
csv_writer.writerow(["+10^-8", "32_16", "32_64", "64_16", "64_32"])
getTorchData(x + a2, csv_writer)
csv_writer.writerow(["+10^-10", "32_16", "32_64", "64_16", "64_32"])
getTorchData(x + a3, csv_writer)
out.close()
def getTorchData(x, csv_writer):
for i in range(20):
res = []
res.append(i)
# x_16 = torch_input_withDiffDype(x, torch.float16)
x_32 = torch_input_withDiffDype(x, torch.float32)
x_64 = torch_input_withDiffDype(x, torch.float64)
torch_tanh_16 = torch_tanhWithDiffType(torch.float16)
torch_tanh_32 = torch_tanhWithDiffType(torch.float32)
torch_tanh_64 = torch_tanhWithDiffType(torch.float64)
out_32_32_1 = torch_tanh_32(x_32).detach().numpy()
out_32_16 = torch_tanh_16(x_32).detach().numpy()
out_32_64 = torch_tanh_64(x_32).detach().numpy()
        diff3 = np.mean(out_32_16 - out_32_32_1)  # high precision cast to low precision
        diff4 = np.mean(out_32_64 - out_32_32_1)  # low precision cast to high precision
res.append(diff3)
res.append(diff4)
out_64_16 = torch_tanh_16(x_64).detach().numpy()
out_64_32 = torch_tanh_32(x_64).detach().numpy()
out_64_64 = torch_tanh_64(x_64).detach().numpy()
        diff5 = np.mean(out_64_16 - out_64_64)  # high precision cast to low precision
        diff6 = np.mean(out_64_32 - out_64_64)  # low precision cast to high precision
res.append(diff5)
res.append(diff6)
csv_writer.writerow(res)
def getdataforTensorflow(f):
out = open(file=f, mode="a", newline='')
csv_writer = csv.writer(out)
for j in range(1000):
print('j= ', j)
print('------------------------------------------')
x = np.random.randn(2, 2)
csv_writer.writerow([x])
csv_writer.writerow(["No.", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32"])
for i in range(100):
print(i)
res = []
res.append(i)
# TF Pooling
x_32 = input_withDiffDype(x, tf.float32)
x_16 = input_withDiffDype(x, tf.float16)
x_64 = input_withDiffDype(x, tf.float64)
tf_Tanh_16 = tf_TanhWithDiffDype('float16')
tf_Tanh_32 = tf_TanhWithDiffDype('float32')
tf_Tanh_64 = tf_TanhWithDiffDype('float64')
out_16_16_1 = tf_Tanh_16(x_16).numpy().astype(np.float32)
out_16_16_2 = tf_Tanh_16(x_16).numpy().astype(np.float64)
out_16_32 = tf_Tanh_32(x_16)
out_16_64 = tf_Tanh_64(x_16)
            diff1 = np.mean(out_16_32 - out_16_16_1)  # low precision cast to high precision
            diff2 = np.mean(out_16_64 - out_16_16_2)  # low precision cast to high precision
res.append(diff1)
res.append(diff2)
out_32_32_1 = tf_Tanh_32(x_32)
out_32_32_2 = tf_Tanh_32(x_32).numpy().astype(np.float64)
out_32_16 = tf_Tanh_16(x_32).numpy().astype(np.float32)
out_32_64 = tf_Tanh_64(x_32)
            diff3 = np.mean(out_32_16 - out_32_32_1)  # high precision cast to low precision
            diff4 = np.mean(out_32_64 - out_32_32_2)  # low precision cast to high precision
res.append(diff3)
res.append(diff4)
out_64_16 = tf_Tanh_16(x_64).numpy().astype(np.float64)
out_64_32 = tf_Tanh_32(x_64).numpy().astype(np.float64)
out_64_64 = tf_Tanh_64(x_64)
            diff5 = np.mean(out_64_16 - out_64_64)  # high precision cast to low precision
            diff6 = np.mean(out_64_32 - out_64_64)  # low precision cast to high precision
res.append(diff5)
res.append(diff6)
csv_writer.writerow(res)
out.close()
def getDataforTfWihthG(f,g):
out = open(file=f, mode="a", newline='')
out1 = open(file=g, mode="a", newline='')
csv_writer = csv.writer(out)
csv_writer1 = csv.writer(out1)
csv_writer.writerow(["No.", "16_32(16)", "16_64(16)", "32_16(32)", "32_64(32)", "64_16(64)", "64_32(64)",
"time1", "32_16(16)", "64_16(16)", "16_32(32)", "64_32(32)", "16_64(64)", "32_64(64)", "time2",
"isNaN"])
    csv_writer1.writerow(
        ["No.", "current max error (same input)", "global max error (same input)",
         "input index of max error 1", "current max error (same operator)",
         "global max error (same operator)", "input index of max error 2"])
h_error1 = 0
h_error2 = 0
for i in range(20):
tmp1 = 0
tmp2 = 0
index1 = 0
index2 = 0
info = []
info.append(i)
for j in range(1000):
print('j= ', j)
x = np.random.randn(2, 2)
res = []
res.append(j)
# TF Pooling
x_32 = input_withDiffDype(x, tf.float32)
x_16 = input_withDiffDype(x, tf.float16)
x_64 = input_withDiffDype(x, tf.float64)
            s = time.time()
tf_Tanh_16 = tf_TanhWithDiffDype('float16')
tf_Tanh_32 = tf_TanhWithDiffDype('float32')
tf_Tanh_64 = tf_TanhWithDiffDype('float64')
out_16_16_1 = tf_Tanh_16(x_16).numpy().astype(np.float32)
out_16_16_2 = tf_Tanh_16(x_16).numpy().astype(np.float64)
out_16_32 = tf_Tanh_32(x_16)
out_16_64 = tf_Tanh_64(x_16)
            diff1 = np.mean(np.abs(out_16_32 - out_16_16_1))  # low precision cast to high precision
            diff2 = np.mean(np.abs(out_16_64 - out_16_16_2))  # low precision cast to high precision
out_32_32_1 = tf_Tanh_32(x_32)
out_32_32_2 = tf_Tanh_32(x_32).numpy().astype(np.float64)
out_32_16 = tf_Tanh_16(x_32).numpy().astype(np.float32)
out_32_64 = tf_Tanh_64(x_32)
            diff3 = np.mean(np.abs(out_32_16 - out_32_32_1))  # high precision cast to low precision
            diff4 = np.mean(np.abs(out_32_64 - out_32_32_2))  # low precision cast to high precision
out_64_16 = tf_Tanh_16(x_64).numpy().astype(np.float64)
out_64_32 = tf_Tanh_32(x_64).numpy().astype(np.float64)
out_64_64 = tf_Tanh_64(x_64)
            diff5 = np.mean(np.abs(out_64_16 - out_64_64))  # high precision cast to low precision
            diff6 = np.mean(np.abs(out_64_32 - out_64_64))  # low precision cast to high precision
            e = time.time()
res.append(diff1)
res.append(diff2)
res.append(diff3)
res.append(diff4)
res.append(diff5)
res.append(diff6)
res.append(e-s)
s = time.time()
out_16_16 = tf_Tanh_16(x_16)
diff7 = np.mean(np.abs(tf_Tanh_16(x_32) - out_16_16))
diff8 = np.mean(np.abs(tf_Tanh_16(x_64) - out_16_16))
diff9 = np.mean(np.abs(tf_Tanh_32(x_16) - out_32_32_1))
diff10 = np.mean(np.abs(tf_Tanh_32(x_64) - out_32_32_1))
diff11 = np.mean(np.abs(tf_Tanh_64(x_16) - out_64_64))
diff12 = np.mean(np.abs(tf_Tanh_64(x_32) - out_64_64))
e = time.time()
res.append(diff7)
res.append(diff8)
res.append(diff9)
res.append(diff10)
res.append(diff11)
res.append(diff12)
res.append(e - s)
for n in out_32_32_1.numpy().ravel():
if math.isnan(n):
res.append("NAN")
break
csv_writer.writerow(res)
if max(res[1:7]) > tmp1:
index1 = j
tmp1 = max(max(res[1:7]), tmp1)
if max(res[8:14]) > tmp2:
index2 = j
tmp2 = max(max(res[8:14]), tmp2)
h_error1 = max(h_error1, tmp1)
h_error2 = max(h_error2, tmp2)
info.append(tmp1)
info.append(h_error1)
info.append(index1)
info.append(tmp2)
info.append(h_error2)
info.append(index2)
csv_writer1.writerow(info)
out.close()
out1.close()
def tf_disturb(f):
out = open(file=f, mode="a", newline='')
csv_writer = csv.writer(out)
for j in range(1000):
print('j= ', j)
print('------------------------------------------')
x = np.random.randn(2, 2)
a1 = 0.000001 * np.ones((2, 2), np.float64)
a2 = 0.00000001 * np.ones((2, 2), np.float64)
a3 = 0.0000000001 * np.ones((2, 2), np.float64)
csv_writer.writerow([x])
csv_writer.writerow(["No.", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32"])
getdata(x, csv_writer)
csv_writer.writerow(["+10^-6", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32"])
getdata(x + a1, csv_writer)
csv_writer.writerow(["+10^-8", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32"])
getdata(x + a2, csv_writer)
csv_writer.writerow(["+10^-10", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32"])
getdata(x + a3, csv_writer)
out.close()
def getdata(x, csv_writer):
for i in range(20):
res = []
res.append(i)
# TF Pooling
x_32 = input_withDiffDype(x, tf.float32)
x_16 = input_withDiffDype(x, tf.float16)
x_64 = input_withDiffDype(x, tf.float64)
tf_Tanh_16 = tf_TanhWithDiffDype('float16')
tf_Tanh_32 = tf_TanhWithDiffDype('float32')
tf_Tanh_64 = tf_TanhWithDiffDype('float64')
out_16_16_1 = tf_Tanh_16(x_16).numpy().astype(np.float32)
out_16_16_2 = tf_Tanh_16(x_16).numpy().astype(np.float64)
out_16_32 = tf_Tanh_32(x_16)
out_16_64 = tf_Tanh_64(x_16)
        diff1 = np.mean(out_16_32 - out_16_16_1)  # low precision cast to high precision
        diff2 = np.mean(out_16_64 - out_16_16_2)  # low precision cast to high precision
res.append(diff1)
res.append(diff2)
out_32_32_1 = tf_Tanh_32(x_32)
out_32_32_2 = tf_Tanh_32(x_32).numpy().astype(np.float64)
out_32_16 = tf_Tanh_16(x_32).numpy().astype(np.float32)
out_32_64 = tf_Tanh_64(x_32)
        diff3 = np.mean(out_32_16 - out_32_32_1)  # high precision cast to low precision
        diff4 = np.mean(out_32_64 - out_32_32_2)  # low precision cast to high precision
res.append(diff3)
res.append(diff4)
out_64_16 = tf_Tanh_16(x_64).numpy().astype(np.float64)
out_64_32 = tf_Tanh_32(x_64).numpy().astype(np.float64)
out_64_64 = tf_Tanh_64(x_64)
        diff5 = np.mean(out_64_16 - out_64_64)  # high precision cast to low precision
        diff6 = np.mean(out_64_32 - out_64_64)  # low precision cast to high precision
res.append(diff5)
res.append(diff6)
csv_writer.writerow(res)
def tf_disturb_timeflow(f,g):
out = open(file=f, mode="a", newline='')
out1 = open(file=g, mode="a", newline='')
csv_writer = csv.writer(out)
csv_writer1 = csv.writer(out1)
csv_writer.writerow(["No.", "16_32", "16_64", "32_16", "32_64", "64_16", "64_32","time","disturb"])
    csv_writer1.writerow(["No.", "current max error", "global max error", "input index of max error"])
h_error = 0
a1 = 0.000001 * np.ones((2, 2), np.float64)
a2 = 0.00000001 * np.ones((2, 2), np.float64)
a3 = 0.0000000001 * np.ones((2, 2), np.float64)
for i in range(20):
        tmp = 0
        index = 0
        info = []
        info.append(i)
        for j in range(1000):
            err = []
            print('j= ', j)
            print('------------------------------------------')
            x = np.random.randn(2, 2)
            getdatafortimeflow(x, csv_writer, j, "0", err)
            getdatafortimeflow(x + a1, csv_writer, j, "e-6", err)
            getdatafortimeflow(x + a2, csv_writer, j, "e-8", err)
            getdatafortimeflow(x + a3, csv_writer, j, "e-10", err)
            if max(err) > tmp:
                tmp = max(err)
                index = j
        h_error = max(h_error, tmp)
info.append(tmp)
info.append(h_error)
info.append(index)
csv_writer1.writerow(info)
out.close()
out1.close()
def getdatafortimeflow(x, csv_writer, j, disturb, err):
res = []
res.append(j)
x_32 = input_withDiffDype(x, tf.float32)
x_16 = input_withDiffDype(x, tf.float16)
x_64 = input_withDiffDype(x, tf.float64)
s = time.time()
tf_Tanh_16 = tf_TanhWithDiffDype('float16')
tf_Tanh_32 = tf_TanhWithDiffDype('float32')
tf_Tanh_64 = tf_TanhWithDiffDype('float64')
out_16_16_1 = tf_Tanh_16(x_16).numpy().astype(np.float32)
out_16_16_2 = tf_Tanh_16(x_16).numpy().astype(np.float64)
out_16_32 = tf_Tanh_32(x_16)
out_16_64 = tf_Tanh_64(x_16)
    diff1 = np.mean(out_16_32 - out_16_16_1)  # low precision cast to high precision
    diff2 = np.mean(out_16_64 - out_16_16_2)  # low precision cast to high precision
out_32_32_1 = tf_Tanh_32(x_32)
out_32_32_2 = tf_Tanh_32(x_32).numpy().astype(np.float64)
out_32_16 = tf_Tanh_16(x_32).numpy().astype(np.float32)
out_32_64 = tf_Tanh_64(x_32)
    diff3 = np.mean(out_32_16 - out_32_32_1)  # high precision cast to low precision
    diff4 = np.mean(out_32_64 - out_32_32_2)  # low precision cast to high precision
out_64_16 = tf_Tanh_16(x_64).numpy().astype(np.float64)
out_64_32 = tf_Tanh_32(x_64).numpy().astype(np.float64)
out_64_64 = tf_Tanh_64(x_64)
    diff5 = np.mean(out_64_16 - out_64_64)  # high precision cast to low precision
    diff6 = np.mean(out_64_32 - out_64_64)  # low precision cast to high precision
e = time.time()
res.append(diff1)
res.append(diff2)
res.append(diff3)
res.append(diff4)
res.append(diff5)
res.append(diff6)
res.append(e - s)
res.append(disturb)
err.append(max(res[1:7]))
csv_writer.writerow(res)
def tf_random(f, g):
out = open(file=f, mode="a", newline='')
out1 = open(file=g, mode="a", newline='')
csv_writer = csv.writer(out)
csv_writer1 = csv.writer(out1)
csv_writer.writerow(["No.", "16_64", "32_64", "time",
"isNaN"])
    csv_writer1.writerow(
        ["No.", "current max error", "global max error", "input index of max error"])
h_error1 = 0
for i in range(20):
tmp1 = 0
index1 = 0
info = []
info.append(i)
for j in range(1000):
print('j= ', j)
x = np.random.randn(2, 2)
res = []
res.append(j)
x_32 = input_withDiffDype(x, tf.float32)
x_16 = input_withDiffDype(x, tf.float16)
x_64 = input_withDiffDype(x, tf.float64)
# TF Conv2D
s = time.time()
tf_Tanh_16 = tf_TanhWithDiffDype('float16')
tf_Tanh_32 = tf_TanhWithDiffDype('float32')
tf_Tanh_64 = tf_TanhWithDiffDype('float64')
out_16_16_2 = tf_Tanh_16(x_16).numpy().astype(np.float64)
out_32_32_2 = tf_Tanh_32(x_32).numpy().astype(np.float64)
out_64_64 = tf_Tanh_64(x_64)
            diff1 = np.mean(np.abs(out_16_16_2 - out_64_64))  # low precision cast to high precision
            diff2 = np.mean(np.abs(out_32_32_2 - out_64_64))  # low precision cast to high precision
e = time.time()
res.append(diff1)
res.append(diff2)
res.append(e - s)
for n in out_64_64.numpy().ravel():
if math.isnan(n):
res.append("NAN")
break
csv_writer.writerow(res)
if max(res[1:3]) > tmp1:
index1 = j
tmp1 = max(res[1:3])
h_error1 = max(h_error1, tmp1)
info.append(tmp1)
info.append(h_error1)
info.append(index1)
csv_writer1.writerow(info)
out.close()
out1.close()
if __name__ == '__main__':
getDataforTfWihthG("/home/ise/opTest/data/timeflow2/tf_gpu_2.3.1/tanh.csv",
"/home/ise/opTest/data/timeflow2/tf_gpu_2.3.1/tanh_count.csv")
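Every `diff` column in this script reduces to the same measurement: evaluate tanh on the same input at two dtypes and take the mean (absolute) difference. A minimal NumPy-only sketch of that measurement, with no TensorFlow or PyTorch involved, so the exact values it prints are not the framework results above:

```python
import numpy as np

np.random.seed(0)
x = np.random.randn(2, 2)

# Evaluate tanh at low and high precision, then compare in float64.
out_lo = np.tanh(x.astype(np.float16)).astype(np.float64)
out_hi = np.tanh(x.astype(np.float64))
diff = np.mean(np.abs(out_lo - out_hi))

# float16 carries ~3 decimal digits, so the mean error is small but nonzero.
print(diff)
```

Casting both outputs to float64 before subtracting mirrors what the script does with `.astype(np.float64)` on the low-precision results.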
# src/swimport/pools/derived_types/__init__.py (talos-gis/swimport, MIT)
import swimport.pools.derived_types.callable_
import swimport.pools.derived_types.iter_
import swimport.pools.derived_types.map_
import swimport.pools.derived_types.py_iterable
import swimport.pools.derived_types.slice_
import swimport.pools.derived_types.tuple_
import swimport.pools.derived_types.array
# nautobot/dcim/migrations/0002_initial_part_2.py (nautobot, Apache-2.0)
# Generated by Django 3.1.3 on 2021-02-20 08:07
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import mptt.fields
import nautobot.extras.models.statuses
import taggit.managers
class Migration(migrations.Migration):
initial = True
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
("tenancy", "0001_initial"),
("dcim", "0001_initial_part_1"),
("extras", "0001_initial_part_1"),
("contenttypes", "0002_remove_content_type_name"),
]
operations = [
migrations.AddField(
model_name="virtualchassis",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="site",
name="region",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="sites",
to="dcim.region",
),
),
migrations.AddField(
model_name="site",
name="status",
field=nautobot.extras.models.statuses.StatusField(
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="dcim_site_related",
to="extras.status",
),
),
migrations.AddField(
model_name="site",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="site",
name="tenant",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="sites",
to="tenancy.tenant",
),
),
migrations.AddField(
model_name="region",
name="parent",
field=mptt.fields.TreeForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="children",
to="dcim.region",
),
),
migrations.AddField(
model_name="rearporttemplate",
name="device_type",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="rearporttemplates",
to="dcim.devicetype",
),
),
migrations.AddField(
model_name="rearport",
name="_cable_peer_type",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="contenttypes.contenttype",
),
),
migrations.AddField(
model_name="rearport",
name="cable",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="dcim.cable",
),
),
migrations.AddField(
model_name="rearport",
name="device",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="rearports",
to="dcim.device",
),
),
migrations.AddField(
model_name="rearport",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="rackreservation",
name="rack",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="reservations",
to="dcim.rack",
),
),
migrations.AddField(
model_name="rackreservation",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="rackreservation",
name="tenant",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="rackreservations",
to="tenancy.tenant",
),
),
migrations.AddField(
model_name="rackreservation",
name="user",
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to=settings.AUTH_USER_MODEL),
),
migrations.AddField(
model_name="rackgroup",
name="parent",
field=mptt.fields.TreeForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="children",
to="dcim.rackgroup",
),
),
migrations.AddField(
model_name="rackgroup",
name="site",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="rack_groups",
to="dcim.site",
),
),
migrations.AddField(
model_name="rack",
name="group",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="racks",
to="dcim.rackgroup",
),
),
migrations.AddField(
model_name="rack",
name="role",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="racks",
to="dcim.rackrole",
),
),
migrations.AddField(
model_name="rack",
name="site",
field=models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="racks",
to="dcim.site",
),
),
migrations.AddField(
model_name="rack",
name="status",
field=nautobot.extras.models.statuses.StatusField(
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="dcim_rack_related",
to="extras.status",
),
),
migrations.AddField(
model_name="rack",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="rack",
name="tenant",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="racks",
to="tenancy.tenant",
),
),
migrations.AddField(
model_name="powerporttemplate",
name="device_type",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="powerporttemplates",
to="dcim.devicetype",
),
),
migrations.AddField(
model_name="powerport",
name="_cable_peer_type",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="contenttypes.contenttype",
),
),
migrations.AddField(
model_name="powerport",
name="_path",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="dcim.cablepath",
),
),
migrations.AddField(
model_name="powerport",
name="cable",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="dcim.cable",
),
),
migrations.AddField(
model_name="powerport",
name="device",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="powerports",
to="dcim.device",
),
),
migrations.AddField(
model_name="powerport",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="powerpanel",
name="rack_group",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
to="dcim.rackgroup",
),
),
migrations.AddField(
model_name="powerpanel",
name="site",
field=models.ForeignKey(on_delete=django.db.models.deletion.PROTECT, to="dcim.site"),
),
migrations.AddField(
model_name="powerpanel",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="poweroutlettemplate",
name="device_type",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="poweroutlettemplates",
to="dcim.devicetype",
),
),
migrations.AddField(
model_name="poweroutlettemplate",
name="power_port",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="poweroutlet_templates",
to="dcim.powerporttemplate",
),
),
migrations.AddField(
model_name="poweroutlet",
name="_cable_peer_type",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="contenttypes.contenttype",
),
),
migrations.AddField(
model_name="poweroutlet",
name="_path",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="dcim.cablepath",
),
),
migrations.AddField(
model_name="poweroutlet",
name="cable",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="dcim.cable",
),
),
migrations.AddField(
model_name="poweroutlet",
name="device",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="poweroutlets",
to="dcim.device",
),
),
migrations.AddField(
model_name="poweroutlet",
name="power_port",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="poweroutlets",
to="dcim.powerport",
),
),
migrations.AddField(
model_name="poweroutlet",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="powerfeed",
name="_cable_peer_type",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="contenttypes.contenttype",
),
),
migrations.AddField(
model_name="powerfeed",
name="_path",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="dcim.cablepath",
),
),
migrations.AddField(
model_name="powerfeed",
name="cable",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="dcim.cable",
),
),
migrations.AddField(
model_name="powerfeed",
name="power_panel",
field=models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="powerfeeds",
to="dcim.powerpanel",
),
),
migrations.AddField(
model_name="powerfeed",
name="rack",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
to="dcim.rack",
),
),
migrations.AddField(
model_name="powerfeed",
name="status",
field=nautobot.extras.models.statuses.StatusField(
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="dcim_powerfeed_related",
to="extras.status",
),
),
migrations.AddField(
model_name="powerfeed",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="platform",
name="manufacturer",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="platforms",
to="dcim.manufacturer",
),
),
migrations.AddField(
model_name="inventoryitem",
name="device",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="inventoryitems",
to="dcim.device",
),
),
migrations.AddField(
model_name="inventoryitem",
name="manufacturer",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.PROTECT,
related_name="inventory_items",
to="dcim.manufacturer",
),
),
migrations.AddField(
model_name="inventoryitem",
name="parent",
field=mptt.fields.TreeForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="child_items",
to="dcim.inventoryitem",
),
),
migrations.AddField(
model_name="inventoryitem",
name="tags",
field=taggit.managers.TaggableManager(through="extras.TaggedItem", to="extras.Tag"),
),
migrations.AddField(
model_name="interfacetemplate",
name="device_type",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="interfacetemplates",
to="dcim.devicetype",
),
),
migrations.AddField(
model_name="interface",
name="_cable_peer_type",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="contenttypes.contenttype",
),
),
migrations.AddField(
model_name="interface",
name="_path",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
to="dcim.cablepath",
),
),
migrations.AddField(
model_name="interface",
name="cable",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="+",
to="dcim.cable",
),
),
migrations.AddField(
model_name="interface",
name="device",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="interfaces",
to="dcim.device",
),
),
migrations.AddField(
model_name="interface",
name="lag",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="member_interfaces",
to="dcim.interface",
),
),
]
# ---- FCNmotif.py (repo: suifengwangshi/MotifC, license: Apache-2.0) ----
# -*- coding: utf8 -*-
import torch
import torch.nn as nn
import torch.nn.functional as F
#from torchsummary import summary
import numpy as np
import sys
# three-layer variant, c_in = 256 (kept commented out below)
"""
class FCN(nn.Module):
# FPN for semantic segmentation
def __init__(self, motiflen=15):
        super(FCN, self).__init__()  # initialize
# encode process
        self.conv1 = nn.Conv1d(in_channels=4, out_channels=64, kernel_size=motiflen)  # note: 1-D convolution here (2-D convolutions are for image tasks)
self.pool1 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv2 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool2 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv3 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=3)
self.pool3 = nn.MaxPool1d(kernel_size=2, stride=2)
        self.conv4 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv4/5/6 process the conservation (evolutionary) information
self.pool4 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv5 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool5 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv6 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=3)
self.pool6 = nn.MaxPool1d(kernel_size=2, stride=2)
# classifier head
c_in = 256
self.linear1 = nn.Linear(c_in, 64)
self.drop = nn.Dropout(p=0.5)
self.linear2 = nn.Linear(64, 1)
# general functions
self.sigmoid = nn.Sigmoid()
self.relu = nn.ReLU(inplace=True)
self.dropout = nn.Dropout(p=0.2)
self._init_weights()
def _init_weights(self):
# Initialize the new built layers
for layer in self.modules():
if isinstance(layer, (nn.Conv1d, nn.Linear)):
# nn.init.kaiming_uniform_(layer.weight, mode='fan_in', nonlinearity='relu')
nn.init.xavier_uniform_(layer.weight)
if layer.bias is not None:
nn.init.constant_(layer.bias, 0)
elif isinstance(layer, nn.BatchNorm1d):
nn.init.constant_(layer.weight, 1)
nn.init.constant_(layer.bias, 0)
def forward(self, data1, data2):
        # Construct a new computation graph at each forward pass
b, _, _ = data1.size()
# encode process
out1 = self.conv1(data1)
out1 = self.relu(out1)
out1 = self.pool1(out1)
out1 = self.dropout(out1)
out1 = self.conv2(out1)
out1 = self.relu(out1)
out1 = self.pool2(out1)
out1 = self.dropout(out1)
out1 = self.conv3(out1)
out1 = self.relu(out1)
out1 = self.pool3(out1)
out1 = self.dropout(out1)
skip1 = out1
out2 = self.conv4(data2)
out2 = self.relu(out2)
out2 = self.pool4(out2)
out2 = self.dropout(out2)
out2 = self.conv5(out2)
out2 = self.relu(out2)
out2 = self.pool5(out2)
out2 = self.dropout(out2)
out2 = self.conv6(out2)
out2 = self.relu(out2)
out2 = self.pool6(out2)
out2 = self.dropout(out2)
skip2 = out2
# classifier
skip4 = skip1 + skip2
# classifier
out3 = skip4.view(b, -1)
out3 = self.linear1(out3)
out3 = self.relu(out3)
out3 = self.drop(out3)
out3 = self.linear2(out3)
out_class = self.sigmoid(out3)
return out_class
"""
# two-layer variant for 101-bp inputs (flattened size c_in = 256)
class FCN(nn.Module):
# FPN for semantic segmentation
def __init__(self, motiflen=15):
        super(FCN, self).__init__()  # initialize
# encode process
        self.conv1 = nn.Conv1d(in_channels=4, out_channels=64, kernel_size=motiflen)  # note: 1-D convolution here (2-D convolutions are for image tasks)
self.pool1 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv2 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool2 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv4 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv4/conv5 process the conservation (evolutionary) information
self.pool4 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv5 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool5 = nn.MaxPool1d(kernel_size=4, stride=4)
# classifier head
c_in = 256
self.linear1 = nn.Linear(c_in, 64)
self.drop = nn.Dropout(p=0.5)
self.linear2 = nn.Linear(64, 1)
# general functions
self.sigmoid = nn.Sigmoid()
self.relu = nn.ReLU(inplace=True)
self.dropout = nn.Dropout(p=0.2)
self._init_weights()
def _init_weights(self):
# Initialize the new built layers
for layer in self.modules():
if isinstance(layer, (nn.Conv1d, nn.Linear)):
# nn.init.kaiming_uniform_(layer.weight, mode='fan_in', nonlinearity='relu')
nn.init.xavier_uniform_(layer.weight)
if layer.bias is not None:
nn.init.constant_(layer.bias, 0)
elif isinstance(layer, nn.BatchNorm1d):
nn.init.constant_(layer.weight, 1)
nn.init.constant_(layer.bias, 0)
def forward(self, data1, data2):
        # Construct a new computation graph at each forward pass
b, _, _ = data1.size()
# encode process
out1 = self.conv1(data1)
out1 = self.relu(out1)
out1 = self.pool1(out1)
out1 = self.dropout(out1)
out1 = self.conv2(out1)
out1 = self.relu(out1)
out1 = self.pool2(out1)
out1 = self.dropout(out1)
skip1 = out1
out2 = self.conv4(data2)
out2 = self.relu(out2)
out2 = self.pool4(out2)
out2 = self.dropout(out2)
out2 = self.conv5(out2)
out2 = self.relu(out2)
out2 = self.pool5(out2)
out2 = self.dropout(out2)
skip2 = out2
# classifier
skip4 = skip1 + skip2 # add
# skip4 = torch.cat((skip1, skip2), 1)
# classifier
out3 = skip4.view(b, -1)
out3 = self.linear1(out3)
out3 = self.relu(out3)
out3 = self.drop(out3)
out3 = self.linear2(out3)
out_class = self.sigmoid(out3)
return out_class
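# A quick sanity check on the hard-coded c_in = 256 above. This is an
# editorial sketch (not part of the original model): it derives the flattened
# feature size from the standard unpadded 1-D conv/pool output-length formula,
# assuming the 101-bp input length mentioned in the comments.

```python
def conv1d_out(length, kernel, stride=1):
    # output length of an unpadded, undilated 1-D conv (same formula for max-pool)
    return (length - kernel) // stride + 1


def flattened_size(seq_len=101, motiflen=15, channels=64):
    n = conv1d_out(seq_len, motiflen)   # conv1 / conv4
    n = conv1d_out(n, 4, stride=4)      # pool1 / pool4
    n = conv1d_out(n, 5)                # conv2 / conv5
    n = conv1d_out(n, 4, stride=4)      # pool2 / pool5
    return channels * n                 # both branches end with this shape


print(flattened_size())  # 256, matching c_in in FCN above
```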
# two-layer, four-branch variant
class FCN1(nn.Module):
# FPN for semantic segmentation
def __init__(self, motiflen=15):
        super(FCN1, self).__init__()  # initialize
# encode process
        self.conv1 = nn.Conv1d(in_channels=4, out_channels=64, kernel_size=motiflen)  # note: 1-D convolution here (2-D convolutions are for image tasks)
self.pool1 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv2 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool2 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv4 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv4/conv5 process the second input (conservation information)
self.pool4 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv5 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool5 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv7 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv7/conv8 process the third input
self.pool7 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv8 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool8 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv10 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv10/conv11 process the fourth input
self.pool10 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv11 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool11 = nn.MaxPool1d(kernel_size=4, stride=4)
# classifier head
c_in = 256
self.linear1 = nn.Linear(c_in, 64)
self.drop = nn.Dropout(p=0.5)
self.linear2 = nn.Linear(64, 1)
# general functions
self.sigmoid = nn.Sigmoid()
self.relu = nn.ReLU(inplace=True)
self.dropout = nn.Dropout(p=0.2)
self._init_weights()
def _init_weights(self):
# Initialize the new built layers
for layer in self.modules():
if isinstance(layer, (nn.Conv1d, nn.Linear)):
# nn.init.kaiming_uniform_(layer.weight, mode='fan_in', nonlinearity='relu')
nn.init.xavier_uniform_(layer.weight)
if layer.bias is not None:
nn.init.constant_(layer.bias, 0)
elif isinstance(layer, nn.BatchNorm1d):
nn.init.constant_(layer.weight, 1)
nn.init.constant_(layer.bias, 0)
def forward(self, data1, data2, data3, data4):
        # Construct a new computation graph at each forward pass
b, _, _ = data1.size()
# encode process
out1 = self.conv1(data1)
out1 = self.relu(out1)
out1 = self.pool1(out1)
out1 = self.dropout(out1)
out1 = self.conv2(out1)
out1 = self.relu(out1)
out1 = self.pool2(out1)
out1 = self.dropout(out1)
skip1 = out1
out2 = self.conv4(data2)
out2 = self.relu(out2)
out2 = self.pool4(out2)
out2 = self.dropout(out2)
out2 = self.conv5(out2)
out2 = self.relu(out2)
out2 = self.pool5(out2)
out2 = self.dropout(out2)
skip2 = out2
out3 = self.conv7(data3)
out3 = self.relu(out3)
out3 = self.pool7(out3)
out3 = self.dropout(out3)
out3 = self.conv8(out3)
out3 = self.relu(out3)
out3 = self.pool8(out3)
out3 = self.dropout(out3)
skip3 = out3
out4 = self.conv10(data4)
out4 = self.relu(out4)
out4 = self.pool10(out4)
out4 = self.dropout(out4)
out4 = self.conv11(out4)
out4 = self.relu(out4)
out4 = self.pool11(out4)
out4 = self.dropout(out4)
skip4 = out4
# classifier
skip = skip1 + skip2 + skip3 + skip4 # add
# skip = torch.cat((skip1, skip2, skip3, skip4), 1)
# classifier
out = skip.view(b, -1)
out = self.linear1(out)
out = self.relu(out)
out = self.drop(out)
out = self.linear2(out)
out_class = self.sigmoid(out)
return out_class
class FCN2(nn.Module):
# FPN for semantic segmentation
def __init__(self, motiflen=15):
        super(FCN2, self).__init__()  # initialize
# encode process
        self.conv1 = nn.Conv1d(in_channels=4, out_channels=64, kernel_size=motiflen)  # note: 1-D convolution here (2-D convolutions are for image tasks)
self.pool1 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv2 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool2 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv4 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv4/conv5 process the second input (conservation information)
self.pool4 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv5 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool5 = nn.MaxPool1d(kernel_size=4, stride=4)
        self.conv7 = nn.Conv1d(in_channels=1, out_channels=64, kernel_size=motiflen)  # conv7/conv8 process the third input
self.pool7 = nn.MaxPool1d(kernel_size=4, stride=4)
self.conv8 = nn.Conv1d(in_channels=64, out_channels=64, kernel_size=5)
self.pool8 = nn.MaxPool1d(kernel_size=4, stride=4)
# classifier head
c_in = 256
self.linear1 = nn.Linear(c_in, 64)
self.drop = nn.Dropout(p=0.5)
self.linear2 = nn.Linear(64, 1)
# general functions
self.sigmoid = nn.Sigmoid()
self.relu = nn.ReLU(inplace=True)
self.dropout = nn.Dropout(p=0.2)
self._init_weights()
def _init_weights(self):
# Initialize the new built layers
for layer in self.modules():
if isinstance(layer, (nn.Conv1d, nn.Linear)):
# nn.init.kaiming_uniform_(layer.weight, mode='fan_in', nonlinearity='relu')
nn.init.xavier_uniform_(layer.weight)
if layer.bias is not None:
nn.init.constant_(layer.bias, 0)
elif isinstance(layer, nn.BatchNorm1d):
nn.init.constant_(layer.weight, 1)
nn.init.constant_(layer.bias, 0)
def forward(self, data1, data2, data3):
        # Construct a new computation graph at each forward pass
b, _, _ = data1.size()
# encode process
out1 = self.conv1(data1)
out1 = self.relu(out1)
out1 = self.pool1(out1)
out1 = self.dropout(out1)
out1 = self.conv2(out1)
out1 = self.relu(out1)
out1 = self.pool2(out1)
out1 = self.dropout(out1)
skip1 = out1
out2 = self.conv4(data2)
out2 = self.relu(out2)
out2 = self.pool4(out2)
out2 = self.dropout(out2)
out2 = self.conv5(out2)
out2 = self.relu(out2)
out2 = self.pool5(out2)
out2 = self.dropout(out2)
skip2 = out2
out3 = self.conv7(data3)
out3 = self.relu(out3)
out3 = self.pool7(out3)
out3 = self.dropout(out3)
out3 = self.conv8(out3)
out3 = self.relu(out3)
out3 = self.pool8(out3)
out3 = self.dropout(out3)
skip3 = out3
# classifier
skip = skip1 + skip2 + skip3 # add
# skip = torch.cat((skip1, skip2, skip3), 1)
# classifier
out = skip.view(b, -1)
out = self.linear1(out)
out = self.relu(out)
out = self.drop(out)
out = self.linear2(out)
out_class = self.sigmoid(out)
return out_class
# ---- reaver/models/sc2/__init__.py (repo: HatsuneMiku4/reaver, license: MIT) ----
from reaver.models.sc2.policy import SC2MultiPolicy
from reaver.models.sc2.fully_conv import build_fully_conv
# ---- tests/test_stub.py (repo: chalupaul/twitch_dungeon, license: MIT) ----
def test_stub():
    assert True  # a pytest test should assert rather than return a value
# ---- lnbits/extensions/lnticket/migrations.py (repo: supertestnet/lnbits, license: MIT) ----
async def m001_initial(db):
await db.execute(
"""
CREATE TABLE lnticket.forms (
id TEXT PRIMARY KEY,
wallet TEXT NOT NULL,
name TEXT NOT NULL,
description TEXT NOT NULL,
costpword INTEGER NOT NULL,
amountmade INTEGER NOT NULL,
time TIMESTAMP NOT NULL DEFAULT """
+ db.timestamp_now
+ """
);
"""
)
await db.execute(
"""
CREATE TABLE lnticket.tickets (
id TEXT PRIMARY KEY,
form TEXT NOT NULL,
email TEXT NOT NULL,
ltext TEXT NOT NULL,
name TEXT NOT NULL,
wallet TEXT NOT NULL,
sats INTEGER NOT NULL,
time TIMESTAMP NOT NULL DEFAULT """
+ db.timestamp_now
+ """
);
"""
)
async def m002_changed(db):
await db.execute(
"""
CREATE TABLE lnticket.ticket (
id TEXT PRIMARY KEY,
form TEXT NOT NULL,
email TEXT NOT NULL,
ltext TEXT NOT NULL,
name TEXT NOT NULL,
wallet TEXT NOT NULL,
sats INTEGER NOT NULL,
paid BOOLEAN NOT NULL,
time TIMESTAMP NOT NULL DEFAULT """
+ db.timestamp_now
+ """
);
"""
)
for row in [
list(row) for row in await db.fetchall("SELECT * FROM lnticket.tickets")
]:
await db.execute(
"""
INSERT INTO lnticket.ticket (
id,
form,
email,
ltext,
name,
wallet,
sats,
paid
)
VALUES (?, ?, ?, ?, ?, ?, ?, ?)
""",
(
row[0],
row[1],
row[2],
row[3],
row[4],
row[5],
row[6],
True,
),
)
await db.execute("DROP TABLE lnticket.tickets")
async def m003_changed(db):
await db.execute(
"""
CREATE TABLE lnticket.form (
id TEXT PRIMARY KEY,
wallet TEXT NOT NULL,
name TEXT NOT NULL,
webhook TEXT,
description TEXT NOT NULL,
costpword INTEGER NOT NULL,
amountmade INTEGER NOT NULL,
time TIMESTAMP NOT NULL DEFAULT """
+ db.timestamp_now
+ """
);
"""
)
for row in [list(row) for row in await db.fetchall("SELECT * FROM lnticket.forms")]:
await db.execute(
"""
INSERT INTO lnticket.form (
id,
wallet,
name,
webhook,
description,
costpword,
amountmade
)
VALUES (?, ?, ?, ?, ?, ?, ?)
""",
(
row[0],
row[1],
row[2],
row[3],
row[4],
row[5],
row[6],
),
)
await db.execute("DROP TABLE lnticket.forms")
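# The two migrations above follow a common copy-and-swap pattern: create the
# new table, copy the old rows across (filling new columns with a default such
# as paid=True), then drop the old table. An editorial, synchronous sqlite3
# sketch of the same pattern (table and column names are illustrative, not
# the lnbits schema):

```python
import sqlite3


def migrate(conn):
    cur = conn.cursor()
    # new table adds a "paid" column that the old one lacked
    cur.execute("CREATE TABLE ticket (id TEXT PRIMARY KEY, sats INTEGER, paid BOOLEAN)")
    for row in cur.execute("SELECT id, sats FROM tickets").fetchall():
        # pre-existing rows are assumed to have been paid already
        cur.execute("INSERT INTO ticket (id, sats, paid) VALUES (?, ?, ?)", (*row, True))
    cur.execute("DROP TABLE tickets")
    conn.commit()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (id TEXT PRIMARY KEY, sats INTEGER)")
conn.execute("INSERT INTO tickets VALUES ('a', 21)")
migrate(conn)
print(conn.execute("SELECT id, sats, paid FROM ticket").fetchall())  # [('a', 21, 1)]
```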
# ---- tests/test_order.py (repo: masakichi/otcbtc-client, license: MIT) ----
# coding: utf-8
import responses
from otcbtc_client.client import OTCBTCClient
from tests.helper import concat_url_and_params, body_str_to_dict
class TestOrder(object):
@property
def order(self):
return OTCBTCClient(api_key='xxx', api_secret='yyy').order
@responses.activate
def test_list_order(self):
order = self.order
params = {
'id':
1,
'access_key':
'xxx',
'signature':
'c242b60f1830337f7618afab08d378b7cb2e9501fe226e76e0cab0ee93ac1933'
}
responses.add(
responses.GET,
concat_url_and_params(order.build_url(order.ORDER_URI), params),
json={
'id': 1, # Unique order id.
'side': 'buy', # Either 'sell' or 'buy'.
'ord_type': 'limit', # Type of order, now only 'limit'.
'price':
'0.002', # Price for each unit. e.g. If you sell/buy 1 OTB at 0.002 ETH, the price is '0.002'
'avg_price':
'0.0', # Average execution price, average of price in trades.
'state':
'wait', # One of 'wait', 'done', or 'cancel'. An order in 'wait' is an active order, waiting fullfillment; a 'done' order is an order fullfilled; 'cancel' means the order has been cancelled.
'market':
'otbeth', # The market in which the order is placed, e.g. 'otbeth'. All available markets can be found at /api/v2/markets.
'created_at':
'2017-02-01T00:00:00+08:00', # Order create time in iso8601 format.
'volume':
'100.0', # The amount user want to sell/buy. An order could be partially executed, e.g. an order sell 100 otb can be matched with a buy 60 otb order, left 40 otb to be sold; in this case the order's volume would be '100.0', its remaining_volume would be '40.0', its executed volume is '60.0'.
'remaining_volume': '100.0', # The remaining volume
'executed_volume': '0.0', # The executed volume
'trades_count': 1 # Number of trades under this order
})
resp = order.list_order(id=1)
assert resp['id'] == 1
@responses.activate
def test_list_orders(self):
order = self.order
market = 'otbeth'
params = {
'market':
market,
'access_key':
'xxx',
'signature':
'be0694b7c33e92da3ec6ee534f7391fb7d0332fc1d867681c5085c5194ed69c8'
}
responses.add(
responses.GET,
concat_url_and_params(order.build_url(order.ORDERS_URI), params),
json=[
{
'id': 1, # Unique order id.
'side': 'buy', # Either 'sell' or 'buy'.
'ord_type': 'limit', # Type of order, now only 'limit'.
'price':
'0.002', # Price for each unit. e.g. If you sell/buy 1 OTB at 0.002 ETH, the price is '0.002'
'avg_price':
'0.0', # Average execution price, average of price in trades.
'state':
'wait', # One of 'wait', 'done', or 'cancel'. An order in 'wait' is an active order, waiting fullfillment; a 'done' order is an order fullfilled; 'cancel' means the order has been cancelled.
'market':
'otbeth', # The market in which the order is placed, e.g. 'otbeth'. All available markets can be found at /api/v2/markets.
'created_at':
'2017-02-01T00:00:00+08:00', # Order create time in iso8601 format.
'volume':
'100.0', # The amount user want to sell/buy. An order could be partially executed, e.g. an order sell 100 otb can be matched with a buy 60 otb order, left 40 otb to be sold; in this case the order's volume would be '100.0', its remaining_volume would be '40.0', its executed volume is '60.0'.
'remaining_volume': '100.0', # The remaining volume
'executed_volume': '0.0', # The executed volume
                    'trades_count': 1  # Number of trades under this order
},
{
'id': 3,
'side': 'sell',
'ord_type': 'limit',
'price': '0.003',
'avg_price': '0.0',
'state': 'wait',
'market': 'otbeth',
'created_at': '2017-02-01T00:00:00+08:00',
'volume': '100.0',
'remaining_volume': '100.0',
'executed_volume': '0.0',
'trades_count': 0
}
],
match_querystring=True)
resp = order.list_orders(market=market)
assert isinstance(resp, list)
@responses.activate
def test_create_order(self):
order = self.order
market = 'otbeth'
data = {
'market':
market,
'price':
'0.002',
'side':
'sell',
'volume':
'100',
'access_key':
'xxx',
'signature':
'efcc83119fe25b18f0a02302aaee7765b62d7bf64dc6c5d4f2266f5a5fda4327'
}
responses.add(
responses.POST,
order.build_url(order.ORDERS_URI),
json={
'id': 1, # Unique order id.
'side': 'sell', # Either 'sell' or 'buy'.
'ord_type': 'limit', # Type of order, now only 'limit'.
'price':
'0.002', # Price for each unit. e.g. If you sell/buy 100 OTB at 0.002 ETH, the price is '0.002'.
'avg_price':
'0.0', # Average execution price, average of price in trades.
'state':
'wait', # One of 'wait', 'done', or 'cancel'. An order in 'wait' is an active order, waiting fullfillment; a 'done' order is an order fullfilled; 'cancel' means the order has been cancelled.
'market':
'otbeth', # The market in which the order is placed, e.g. 'otbeth'. All available markets can be found at /api/v2/markets.
'created_at':
'2017-02-01T00:00:00+08:00', # Trade create time in iso8601 format.
'volume':
'100.0', # The amount user want to sell/buy. An order could be partially executed, e.g. an order sell 100 otb can be matched with a buy 60 otb order, left 40 otb to be sold; in this case the order's volume would be '100.0', its remaining_volume would be '40.0', its executed volume is '60.0'.
'remaining_volume': '100.0', # The remaining volume
'executed_volume': '0.0', # The executed volume
'trades_count': 0 # Number of trades under this order
},
match_querystring=True)
order.create_order(
market=market, side='sell', price='0.002', volume='100')
        # XXX(Gimo): the responses library has no match_request_body option, so compare the request body manually.
assert body_str_to_dict(responses.calls[0].request.body) == data
@responses.activate
def test_cancel_order(self):
order = self.order
data = {
'id':
'1',
'access_key':
'xxx',
'signature':
'47ba4a04e8f5471a05078f8dd13976b7caa80665c5f8152d654486de327c395c'
}
responses.add(
responses.POST,
order.build_url(order.DELETE_ORDER_URI),
json={
'id': 1, # Unique order id.
'side': 'buy', # Either 'sell' or 'buy'.
'ord_type': 'limit', # Type of order, now only 'limit'.
'price':
'0.002', # Price for each unit. e.g. If you sell/buy 100 OTB at 0.002 ETH, the price is '0.002'.
'avg_price':
'0.0', # Average execution price, average of price in trades.
'state':
'wait', # One of 'wait', 'done', or 'cancel'. An order in 'wait' is an active order, waiting fullfillment; a 'done' order is an order fullfilled; 'cancel' means the order has been cancelled.
'market':
'otbeth', # The market in which the order is placed, e.g. 'otbeth'. All available markets can be found at /api/v2/markets.
'created_at':
'2017-02-01T00:00:00+08:00', # Trade create time in iso8601 format.
'volume':
'100.0', # The amount user want to sell/buy. An order could be partially executed, e.g. an order sell 100 otb can be matched with a buy 60 otb order, left 40 otb to be sold; in this case the order's volume would be '100.0', its remaining_volume would be '40.0', its executed volume is '60.0'.
'remaining_volume': '100.0', # The remaining volume
'executed_volume': '0.0', # The executed volume
'trades_count': 0 # Number of trades under this order
},
match_querystring=True)
order.cancel_order(id='1')
        # XXX(Gimo): the responses library has no match_request_body option, so compare the request body manually.
assert body_str_to_dict(responses.calls[0].request.body) == data
@responses.activate
def test_cancel_orders(self):
order = self.order
data = {
'access_key':
'xxx',
'signature':
'f2ab1d061ad07a2de9fe7658b7203ce28ed6b6511287502b2a7a869172039bcf'
}
responses.add(
responses.POST,
order.build_url(order.CLEAR_ORDERS_URI),
json=[
{
'id': 2, # Unique order id.
'side': 'buy', # Either 'sell' or 'buy'.
'ord_type': 'limit', # Type of order, now only 'limit'.
'price':
'0.0015', # Price for each unit. e.g. If you sell/buy 100 OTB at 0.0015 ETH, the price is '0.0015'.
'avg_price':
'0.0', # Average execution price, average of price in trades.
'state':
'wait', # One of 'wait', 'done', or 'cancel'. An order in 'wait' is an active order awaiting fulfillment; a 'done' order has been fulfilled; 'cancel' means the order has been cancelled.
'market':
'otbeth', # The market in which the order is placed, e.g. 'otbeth'. All available markets can be found at /api/v2/markets.
'created_at':
'2017-02-01T00:00:00+08:00', # Trade create time in iso8601 format.
'volume':
'100.0', # The amount the user wants to sell/buy. An order can be partially executed, e.g. an order selling 100 otb can be matched with a buy order for 60 otb, leaving 40 otb to be sold; in this case the order's volume would be '100.0', its remaining_volume '40.0', and its executed_volume '60.0'.
'remaining_volume': '60.0', # The remaining volume
'executed_volume': '40.0', # The executed volume
'trades_count': 1 # Number of trades under this order
},
{
'id': 1,
'side': 'sell',
'ord_type': 'limit',
'price': '0.0012',
'avg_price': '0.0',
'state': 'wait',
'market': 'otbeth',
'created_at': '2017-02-01T00:00:00+08:00',
'volume': '100.0',
'remaining_volume': '100.0',
'executed_volume': '0.0',
'trades_count': 0
}
],
match_querystring=True)
order.cancel_orders()
# XXX(Gimo): needed because the responses library has no parameter like match_request_body.
assert body_str_to_dict(responses.calls[0].request.body) == data
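The tests above rely on a `body_str_to_dict` helper defined elsewhere in this suite. A minimal sketch of what it might look like, assuming the recorded request body is a urlencoded form string (the helper name comes from the tests; this implementation is a guess):

```python
from urllib.parse import parse_qsl


def body_str_to_dict_sketch(body):
    # `responses` may expose the recorded request body as bytes or str;
    # normalize to text, then parse the urlencoded pairs into a dict.
    if isinstance(body, bytes):
        body = body.decode('utf-8')
    return dict(parse_qsl(body))
```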
| 50.348361 | 313 | 0.523158 | 1,472 | 12,285 | 4.289402 | 0.11413 | 0.022173 | 0.020589 | 0.01853 | 0.856826 | 0.828952 | 0.800602 | 0.792366 | 0.767184 | 0.767184 | 0 | 0.083064 | 0.369882 | 12,285 | 243 | 314 | 50.555556 | 0.732593 | 0.388116 | 0 | 0.747826 | 0 | 0 | 0.22088 | 0.066425 | 0 | 0 | 0 | 0 | 0.021739 | 1 | 0.026087 | false | 0 | 0.013043 | 0.004348 | 0.047826 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7d630493628484a2618ca923afa00275e5c3c18f | 213 | py | Python | termination_handler/handlers/__init__.py | dgzlopes/termination-handler | 526977887cfd9835075de71069fadb095933a654 | [
"MIT"
] | 7 | 2019-08-17T13:58:07.000Z | 2021-12-15T20:14:58.000Z | termination_handler/handlers/__init__.py | dgzlopes/termination-handler | 526977887cfd9835075de71069fadb095933a654 | [
"MIT"
] | 2 | 2020-07-16T07:55:49.000Z | 2020-07-20T20:03:27.000Z | termination_handler/handlers/__init__.py | dgzlopes/termination-handler | 526977887cfd9835075de71069fadb095933a654 | [
"MIT"
] | 3 | 2020-07-16T07:07:49.000Z | 2021-02-12T06:10:23.000Z | from .handler import AbstractHandler # noqa: F401
from .k8s_handler import K8sHandler # noqa: F401
from .nomad_handler import NomadHandler # noqa: F401
from .slack_handler import SlackHandler # noqa: F401
| 42.6 | 54 | 0.774648 | 27 | 213 | 6 | 0.444444 | 0.320988 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079096 | 0.169014 | 213 | 4 | 55 | 53.25 | 0.836158 | 0.201878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
81639ca88f234a4f52162465b40b32d3eddde71e | 20,720 | py | Python | python/sdk/client/api/alert_api.py | ashwinath/merlin | 087a7fa6fb21e4c771d64418bd58873175226ca1 | [
"Apache-2.0"
] | null | null | null | python/sdk/client/api/alert_api.py | ashwinath/merlin | 087a7fa6fb21e4c771d64418bd58873175226ca1 | [
"Apache-2.0"
] | null | null | null | python/sdk/client/api/alert_api.py | ashwinath/merlin | 087a7fa6fb21e4c771d64418bd58873175226ca1 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Merlin
API Guide for accessing Merlin's model management, deployment, and serving functionalities # noqa: E501
OpenAPI spec version: 0.7.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from client.api_client import ApiClient
class AlertApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def alerts_teams_get(self, **kwargs): # noqa: E501
"""Lists teams for alert notification channel. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alerts_teams_get(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[str]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alerts_teams_get_with_http_info(**kwargs) # noqa: E501
else:
(data) = self.alerts_teams_get_with_http_info(**kwargs) # noqa: E501
return data
def alerts_teams_get_with_http_info(self, **kwargs): # noqa: E501
"""Lists teams for alert notification channel. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alerts_teams_get_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[str]
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alerts_teams_get" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/alerts/teams', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[str]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def models_model_id_alerts_get(self, model_id, **kwargs): # noqa: E501
"""Lists alerts for given model. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_alerts_get(model_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:return: list[ModelEndpointAlert]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.models_model_id_alerts_get_with_http_info(model_id, **kwargs) # noqa: E501
else:
(data) = self.models_model_id_alerts_get_with_http_info(model_id, **kwargs) # noqa: E501
return data
def models_model_id_alerts_get_with_http_info(self, model_id, **kwargs): # noqa: E501
"""Lists alerts for given model. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_alerts_get_with_http_info(model_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:return: list[ModelEndpointAlert]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['model_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method models_model_id_alerts_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'model_id' is set
if ('model_id' not in params or
params['model_id'] is None):
raise ValueError("Missing the required parameter `model_id` when calling `models_model_id_alerts_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'model_id' in params:
path_params['model_id'] = params['model_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/models/{model_id}/alerts', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[ModelEndpointAlert]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def models_model_id_endpoints_model_endpoint_id_alert_get(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Gets alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_get(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:return: ModelEndpointAlert
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.models_model_id_endpoints_model_endpoint_id_alert_get_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
else:
(data) = self.models_model_id_endpoints_model_endpoint_id_alert_get_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
return data
def models_model_id_endpoints_model_endpoint_id_alert_get_with_http_info(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Gets alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_get_with_http_info(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:return: ModelEndpointAlert
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['model_id', 'model_endpoint_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method models_model_id_endpoints_model_endpoint_id_alert_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'model_id' is set
if ('model_id' not in params or
params['model_id'] is None):
raise ValueError("Missing the required parameter `model_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_get`") # noqa: E501
# verify the required parameter 'model_endpoint_id' is set
if ('model_endpoint_id' not in params or
params['model_endpoint_id'] is None):
raise ValueError("Missing the required parameter `model_endpoint_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_get`") # noqa: E501
collection_formats = {}
path_params = {}
if 'model_id' in params:
path_params['model_id'] = params['model_id'] # noqa: E501
if 'model_endpoint_id' in params:
path_params['model_endpoint_id'] = params['model_endpoint_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/models/{model_id}/endpoints/{model_endpoint_id}/alert', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='ModelEndpointAlert', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def models_model_id_endpoints_model_endpoint_id_alert_post(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Creates alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_post(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:param ModelEndpointAlert body:
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.models_model_id_endpoints_model_endpoint_id_alert_post_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
else:
(data) = self.models_model_id_endpoints_model_endpoint_id_alert_post_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
return data
def models_model_id_endpoints_model_endpoint_id_alert_post_with_http_info(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Creates alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_post_with_http_info(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:param ModelEndpointAlert body:
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['model_id', 'model_endpoint_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method models_model_id_endpoints_model_endpoint_id_alert_post" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'model_id' is set
if ('model_id' not in params or
params['model_id'] is None):
raise ValueError("Missing the required parameter `model_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_post`") # noqa: E501
# verify the required parameter 'model_endpoint_id' is set
if ('model_endpoint_id' not in params or
params['model_endpoint_id'] is None):
raise ValueError("Missing the required parameter `model_endpoint_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_post`") # noqa: E501
collection_formats = {}
path_params = {}
if 'model_id' in params:
path_params['model_id'] = params['model_id'] # noqa: E501
if 'model_endpoint_id' in params:
path_params['model_endpoint_id'] = params['model_endpoint_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/models/{model_id}/endpoints/{model_endpoint_id}/alert', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def models_model_id_endpoints_model_endpoint_id_alert_put(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Creates alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_put(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:param ModelEndpointAlert body:
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.models_model_id_endpoints_model_endpoint_id_alert_put_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
else:
(data) = self.models_model_id_endpoints_model_endpoint_id_alert_put_with_http_info(model_id, model_endpoint_id, **kwargs) # noqa: E501
return data
def models_model_id_endpoints_model_endpoint_id_alert_put_with_http_info(self, model_id, model_endpoint_id, **kwargs): # noqa: E501
"""Creates alert for given model endpoint. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.models_model_id_endpoints_model_endpoint_id_alert_put_with_http_info(model_id, model_endpoint_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param int model_id: (required)
:param str model_endpoint_id: (required)
:param ModelEndpointAlert body:
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['model_id', 'model_endpoint_id', 'body'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method models_model_id_endpoints_model_endpoint_id_alert_put" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'model_id' is set
if ('model_id' not in params or
params['model_id'] is None):
raise ValueError("Missing the required parameter `model_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_put`") # noqa: E501
# verify the required parameter 'model_endpoint_id' is set
if ('model_endpoint_id' not in params or
params['model_endpoint_id'] is None):
raise ValueError("Missing the required parameter `model_endpoint_id` when calling `models_model_id_endpoints_model_endpoint_id_alert_put`") # noqa: E501
collection_formats = {}
path_params = {}
if 'model_id' in params:
path_params['model_id'] = params['model_id'] # noqa: E501
if 'model_endpoint_id' in params:
path_params['model_endpoint_id'] = params['model_endpoint_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'body' in params:
body_params = params['body']
# Authentication setting
auth_settings = ['Bearer'] # noqa: E501
return self.api_client.call_api(
'/models/{model_id}/endpoints/{model_endpoint_id}/alert', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
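Every method in the generated class follows the same synchronous/asynchronous dispatch pattern: the public method forwards to a `_with_http_info` twin, which rejects unknown keyword arguments before issuing the call. A stripped-down sketch of that pattern (the class and return value here are illustrative stand-ins, not the real client):

```python
class SketchApi(object):
    # Illustrative twin-method pattern as emitted by swagger-codegen.
    def alerts_teams_get(self, **kwargs):
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            # The async path returns a thread whose .get() yields the data.
            return self.alerts_teams_get_with_http_info(**kwargs)
        (data) = self.alerts_teams_get_with_http_info(**kwargs)
        return data

    def alerts_teams_get_with_http_info(self, **kwargs):
        all_params = ['async_req', '_return_http_data_only',
                      '_preload_content', '_request_timeout']
        for key in kwargs:
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alerts_teams_get" % key)
        # Stand-in for self.api_client.call_api(...); the real method issues
        # GET /alerts/teams with Bearer auth and parses list[str].
        return ['team-a', 'team-b']
```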
| 41.111111 | 166 | 0.634556 | 2,502 | 20,720 | 4.91287 | 0.063949 | 0.058656 | 0.095184 | 0.053693 | 0.958591 | 0.957615 | 0.950781 | 0.944598 | 0.940205 | 0.939636 | 0 | 0.012693 | 0.281371 | 20,720 | 503 | 167 | 41.192843 | 0.812827 | 0.309653 | 0 | 0.791209 | 1 | 0 | 0.217617 | 0.081908 | 0 | 0 | 0 | 0 | 0 | 1 | 0.040293 | false | 0 | 0.014652 | 0 | 0.113553 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
81869f4f250c69f3e7530e77316afe0db0f88a12 | 1,362 | py | Python | example/ex_tprint.py | zoumingzhe/PyTools | 7202268fca71db5e5b35fccc4031002e6ebbc023 | [
"MIT"
] | 3 | 2019-05-02T07:08:15.000Z | 2021-03-10T04:55:03.000Z | example/ex_tprint.py | zoumingzhe/PyTools | 7202268fca71db5e5b35fccc4031002e6ebbc023 | [
"MIT"
] | null | null | null | example/ex_tprint.py | zoumingzhe/PyTools | 7202268fca71db5e5b35fccc4031002e6ebbc023 | [
"MIT"
] | 1 | 2019-05-13T07:26:33.000Z | 2019-05-13T07:26:33.000Z | from ztools import tprint
from ztools import AnsiStyle as style
from ztools import AnsiFore as fore
from ztools import AnsiBack as back
tp = tprint()
print("-----")
tp.color("123", style.underline, fore.red, back.white)
tp.flush()
tp.color("123", style.underline, fore.red)
tp.flush()
tp.color("123", style.underline, back.white)
tp.flush()
tp.color("123", style.underline)
tp.flush()
tp.color("123")
tp.flush()
print("-----")
tp.color("123", fore.red, back.white)
tp.flush()
tp.color("123", fore.red)
tp.flush()
tp.color("123", back.white)
tp.flush()
tp.color("123")
tp.flush()
print("-----")
tp.color(123, fore.red, back.white)
tp.flush()
tp.color(123, fore.red)
tp.flush()
tp.color(123, back.white)
tp.flush()
tp.color(123)
tp.flush()
print("-----")
tp.color((1,2,3), fore.red, back.white)
tp.flush()
tp.color((1,2,3), fore.red)
tp.flush()
tp.color((1,2,3), back.white)
tp.flush()
tp.color((1,2,3))
tp.flush()
print("-----")
tp.color([1,2,3], fore.red, back.white)
tp.flush()
tp.color([1,2,3], fore.red)
tp.flush()
tp.color([1,2,3], back.white)
tp.flush()
tp.color([1,2,3])
tp.flush()
print("-----")
tp.color({'k1':1, 'k2':2, 'k3':3}, fore.red, back.white)
tp.flush()
tp.color({'k1':1, 'k2':2, 'k3':3}, fore.red)
tp.flush()
tp.color({'k1':1, 'k2':2, 'k3':3}, back.white)
tp.flush()
tp.color({'k1':1, 'k2':2, 'k3':3})
tp.flush()
input("按回车(Enter)继续")
| 18.916667 | 56 | 0.634361 | 251 | 1,362 | 3.442231 | 0.115538 | 0.202546 | 0.197917 | 0.30787 | 0.827546 | 0.827546 | 0.827546 | 0.755787 | 0.755787 | 0.655093 | 0 | 0.070445 | 0.093245 | 1,362 | 71 | 57 | 19.183099 | 0.62915 | 0 | 0 | 0.532258 | 0 | 0 | 0.068332 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.064516 | 0 | 0.064516 | 0.129032 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
81b23b4f8a9c6a227c716b85514902591e4e2e6c | 231 | py | Python | tests/test_module.py | ABitMoreDepth/persistent_structures | 98a61bfd9bb560ae952ee04d8ebda7297ca74b51 | [
"MIT"
] | null | null | null | tests/test_module.py | ABitMoreDepth/persistent_structures | 98a61bfd9bb560ae952ee04d8ebda7297ca74b51 | [
"MIT"
] | 4 | 2019-10-13T20:37:21.000Z | 2019-10-13T20:38:42.000Z | tests/test_module.py | ABitMoreDepth/persistent_structures | 98a61bfd9bb560ae952ee04d8ebda7297ca74b51 | [
"MIT"
] | null | null | null | """Test that the persistent_structures imports as expected."""
import persistent_structures
def test_module() -> None:
"""Test that the module behaves as expected."""
assert persistent_structures.__version__ is not None
| 25.666667 | 62 | 0.757576 | 29 | 231 | 5.758621 | 0.586207 | 0.359281 | 0.131737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155844 | 231 | 8 | 63 | 28.875 | 0.85641 | 0.424242 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
81d36a0f1e180fccb998d541812439c33b87629e | 5,651 | py | Python | old/old_bin/cru_ts323_update_launcher.py | ua-snap/downscale | 3fe8ea1774cf82149d19561ce5f19b25e6cba6fb | [
"MIT"
] | 5 | 2020-06-24T21:55:12.000Z | 2022-03-23T16:32:54.000Z | old/old_bin/cru_ts323_update_launcher.py | ua-snap/downscale | 3fe8ea1774cf82149d19561ce5f19b25e6cba6fb | [
"MIT"
] | 17 | 2016-01-04T23:37:47.000Z | 2017-04-17T20:57:02.000Z | snap_scripts/old_scripts/tem_iem_older_scripts_april2018/tem_inputs_iem/old_code/cru_ts323_update_launcher.py | ua-snap/downscale | 3fe8ea1774cf82149d19561ce5f19b25e6cba6fb | [
"MIT"
] | 3 | 2020-09-16T04:48:57.000Z | 2021-05-25T03:46:00.000Z |
# SCRIPT TO RUN THE CRU TS3.1 BUILT SCRIPT OVER THE CRU TS3.2.3 UPDATE (SEPT.2015)
# WHICH EXTENDS THE SERIES TO 12/2014.
# THIS IS CURRENTLY WORKING FOR CLD, TMP, VAP, AND MORE TO COME!
# # # # #
# Author: Michael Lindgren (malindgren@alaska.edu)
# # # # #
# CURRENTLY SET UP TO RUN ON EOS.
# CLD
import os
os.chdir( '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/CODE/tem_ar5_inputs/downscale_cmip5/bin' )
ncores = '14'
base_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323'
cru_ts31 = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323/cru_ts3.23.1901.2014.cld.dat.nc'
cl20_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_october_final/cru_cl20/cld/akcan'
template_raster_fn = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/templates/tas_mean_C_AR5_GFDL-CM3_historical_01_1860.tif'
anomalies_calc_type = 'relative'
downscaling_operation = 'mult'
climatology_begin = '1961'
climatology_end = '1990'
year_begin = '1901'
year_end = '2014'
variable = 'cld'
metric = 'pct'
args_tuples = [ ('hi', cru_ts31), ('ci', cl20_path), ('tr', template_raster_fn),
('base', base_path), ('bt', year_begin), ('et', year_end),
('cbt', climatology_begin), ('cet', climatology_end),
('nc', ncores), ('at', anomalies_calc_type), ('m', metric),
('dso', downscaling_operation), ('v', variable) ]
args = ''.join([ ' -'+flag+' '+value for flag, value in args_tuples ])
os.system( 'ipython2.7 -- tas_cld_cru_ts31_to_cl20_downscaling.py ' + args )
# TAS
import os
os.chdir( '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/CODE/tem_ar5_inputs/downscale_cmip5/bin' )
ncores = '14'
base_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323'
cru_ts31 = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323/cru_ts3.23.1901.2014.tmp.dat.nc'
cl20_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_october_final/cru_cl20/cld/akcan'
template_raster_fn = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/templates/tas_mean_C_AR5_GFDL-CM3_historical_01_1860.tif'
anomalies_calc_type = 'absolute'
downscaling_operation = 'add'
climatology_begin = '1961'
climatology_end = '1990'
year_begin = '1901'
year_end = '2014'
variable = 'tas'
metric = 'C'
args_tuples = [ ('hi', cru_ts31), ('ci', cl20_path), ('tr', template_raster_fn),
('base', base_path), ('bt', year_begin), ('et', year_end),
('cbt', climatology_begin), ('cet', climatology_end),
('nc', ncores), ('at', anomalies_calc_type), ('m', metric),
('dso', downscaling_operation), ('v', variable) ]
args = ''.join([ ' -'+flag+' '+value for flag, value in args_tuples ])
os.system( 'ipython2.7 -- tas_cld_cru_ts31_to_cl20_downscaling.py ' + args )
# VAP (HUR)
import os
os.chdir( '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/CODE/tem_ar5_inputs/downscale_cmip5/bin' )
ncores = '14'
base_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323'
cru_ts31_vap = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323/cru_ts3.23.1901.2014.vap.dat.nc'
cru_ts31_tas = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323/cru_ts3.23.1901.2014.tmp.dat.nc'
cl20_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_october_final/cru_cl20/hur/akcan' # hur
template_raster_fn = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/templates/tas_mean_C_AR5_GFDL-CM3_historical_01_1860.tif'
anomalies_calc_type = 'relative'
downscaling_operation = 'mult'
climatology_begin = '1961'
climatology_end = '1990'
year_begin = '1901'
year_end = '2014'
variable = 'hur'
metric = 'pct'
args_tuples = [ ('hhi', cru_ts31_vap), ('thi', cru_ts31_tas), ('ci', cl20_path), ('tr', template_raster_fn),
('base', base_path), ('bt', year_begin), ('et', year_end),
('cbt', climatology_begin), ('cet', climatology_end),
('nc', ncores), ('at', anomalies_calc_type), ('m', metric),
('dso', downscaling_operation), ('v', variable) ]
args = ''.join([ ' -'+flag+' '+value for flag, value in args_tuples ])
os.system( 'ipython2.7 -i -- hur_cru_ts31_to_cl20_downscaling.py ' + args )
# PRECIP
import os
os.chdir( '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/CODE/tem_ar5_inputs/downscale_cmip5/bin' )
ncores = '14'
base_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323'
cru_ts31 = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_ts323/cru_ts3.23.1901.2014.pre.dat.nc'
cl20_path = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/cru_october_final/cru_cl20/pre/akcan'
template_raster_fn = '/workspace/Shared/Tech_Projects/ALFRESCO_Inputs/project_data/TEM_Data/templates/tas_mean_C_AR5_GFDL-CM3_historical_01_1860.tif'
anomalies_calc_type = 'relative'
downscaling_operation = 'mult'
climatology_begin = '1961'
climatology_end = '1990'
year_begin = '1901'
year_end = '2014'
variable = 'pre'
metric = 'mm'
args_tuples = [ ('hi', cru_ts31), ('ci', cl20_path), ('tr', template_raster_fn),
('base', base_path), ('bt', year_begin), ('et', year_end),
('cbt', climatology_begin), ('cet', climatology_end),
('nc', ncores), ('at', anomalies_calc_type), ('m', metric),
('dso', downscaling_operation), ('v', variable) ]
args = ''.join([ ' -'+flag+' '+value for flag, value in args_tuples ])
os.system( 'ipython2.7 -- tas_cld_cru_ts31_to_cl20_downscaling.py ' + args )
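Each block above rebuilds the same flag string by hand from its `args_tuples`; a small helper (a sketch of a possible refactor, not used by the script as written) captures that assembly in one place:

```python
def build_args(args_tuples):
    # Render [('hi', path), ('v', 'cld'), ...] into the ' -flag value ...'
    # command-line suffix handed to os.system by each block above.
    return ''.join([' -' + flag + ' ' + value for flag, value in args_tuples])
```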
| 46.702479 | 149 | 0.749602 | 828 | 5,651 | 4.778986 | 0.155797 | 0.079606 | 0.100834 | 0.14329 | 0.904979 | 0.904979 | 0.904979 | 0.897397 | 0.897397 | 0.897397 | 0 | 0.054271 | 0.096797 | 5,651 | 120 | 150 | 47.091667 | 0.721003 | 0.053088 | 0 | 0.808989 | 0 | 0.05618 | 0.504129 | 0.441254 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.044944 | 0 | 0.044944 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
c49ad91e2ed5fe2988a43b2bdcde263133ddd00d | 12,426 | py | Python | tests/test_siemens.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | tests/test_siemens.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | tests/test_siemens.py | clintonjwang/dicom2nifti | 6f7533cccb587d63423c6f77824a60776c8d5b5d | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
dicom2nifti
@author: abrys
"""
import os
import shutil
import tempfile
import unittest
import nibabel
import numpy
import dicom2nifti.compressed_dicom as compressed_dicom
import pydicom
import tests.test_data as test_data
import dicom2nifti.convert_siemens as convert_siemens
from dicom2nifti.common import read_dicom_directory
from tests.test_tools import assert_compare_nifti, assert_compare_bval, assert_compare_bvec, ground_thruth_filenames
class TestConversionSiemens(unittest.TestCase):
def test_diffusion_imaging(self):
tmp_output_dir = tempfile.mkdtemp()
try:
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_DTI),
None)
self.assertTrue(results.get('NII_FILE') is None)
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
self.assertTrue(results.get('BVAL_FILE') is None)
self.assertTrue(isinstance(results['BVAL'], numpy.ndarray))
self.assertTrue(results.get('BVEC_FILE') is None)
self.assertTrue(isinstance(results['BVEC'], numpy.ndarray))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_DTI),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
assert_compare_bval(results['BVAL_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI)[2])
self.assertTrue(isinstance(results['BVAL'], numpy.ndarray))
            assert_compare_bvec(results['BVEC_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI)[3])
self.assertTrue(isinstance(results['BVEC'], numpy.ndarray))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_DTI_IMPLICIT),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI_IMPLICIT)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
assert_compare_bval(results['BVAL_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI_IMPLICIT)[2])
self.assertTrue(isinstance(results['BVAL'], numpy.ndarray))
            assert_compare_bvec(results['BVEC_FILE'],
ground_thruth_filenames(test_data.SIEMENS_DTI_IMPLICIT)[3])
self.assertTrue(isinstance(results['BVEC'], numpy.ndarray))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
assert_compare_bval(results['BVAL_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI)[2])
self.assertTrue(isinstance(results['BVAL'], numpy.ndarray))
            assert_compare_bvec(results['BVEC_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI)[3])
self.assertTrue(isinstance(results['BVEC'], numpy.ndarray))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI_IMPLICIT),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI_IMPLICIT)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
assert_compare_bval(results['BVAL_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI_IMPLICIT)[2])
self.assertTrue(isinstance(results['BVAL'], numpy.ndarray))
            assert_compare_bvec(results['BVEC_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_DTI_IMPLICIT)[3])
self.assertTrue(isinstance(results['BVEC'], numpy.ndarray))
finally:
shutil.rmtree(tmp_output_dir)
def test_4d(self):
tmp_output_dir = tempfile.mkdtemp()
try:
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_FMRI),
None)
self.assertTrue(results.get('NII_FILE') is None)
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_FMRI),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_FMRI)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_FMRI_IMPLICIT),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_FMRI_IMPLICIT)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_FMRI)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI_IMPLICIT),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_CLASSIC_FMRI_IMPLICIT)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
finally:
shutil.rmtree(tmp_output_dir)
def test_anatomical(self):
tmp_output_dir = tempfile.mkdtemp()
try:
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_ANATOMICAL),
None)
self.assertTrue(results.get('NII_FILE') is None)
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_ANATOMICAL),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_ANATOMICAL)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
results = convert_siemens.dicom_to_nifti(read_dicom_directory(test_data.SIEMENS_ANATOMICAL_IMPLICIT),
os.path.join(tmp_output_dir, 'test.nii.gz'))
assert_compare_nifti(results['NII_FILE'],
ground_thruth_filenames(test_data.SIEMENS_ANATOMICAL_IMPLICIT)[0])
self.assertTrue(isinstance(results['NII'], nibabel.nifti1.Nifti1Image))
finally:
shutil.rmtree(tmp_output_dir)
def test_is_mosaic(self):
        # test with directory
assert convert_siemens._is_mosaic(read_dicom_directory(test_data.SIEMENS_DTI))
assert convert_siemens._is_mosaic(read_dicom_directory(test_data.SIEMENS_FMRI))
assert not convert_siemens._is_mosaic(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI))
assert not convert_siemens._is_mosaic(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI))
assert not convert_siemens._is_mosaic(read_dicom_directory(test_data.SIEMENS_ANATOMICAL))
# test with grouped dicoms
assert convert_siemens._is_mosaic(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_DTI)))
assert convert_siemens._is_mosaic(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_FMRI)))
assert not convert_siemens._is_mosaic(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI)))
assert not convert_siemens._is_mosaic(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI)))
assert not convert_siemens._is_mosaic(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_ANATOMICAL)))
def test_is_4d(self):
assert convert_siemens._is_4d(read_dicom_directory(test_data.SIEMENS_DTI))
assert convert_siemens._is_4d(read_dicom_directory(test_data.SIEMENS_FMRI))
assert not convert_siemens._is_4d(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI))
assert not convert_siemens._is_4d(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI))
assert not convert_siemens._is_4d(read_dicom_directory(test_data.SIEMENS_ANATOMICAL))
def test_is_diffusion_imaging(self):
assert convert_siemens._is_diffusion_imaging(read_dicom_directory(test_data.SIEMENS_DTI)[0])
assert not convert_siemens._is_diffusion_imaging(read_dicom_directory(test_data.SIEMENS_FMRI)[0])
assert convert_siemens._is_diffusion_imaging(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI)[0])
assert not convert_siemens._is_diffusion_imaging(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI)[0])
assert not convert_siemens._is_diffusion_imaging(read_dicom_directory(test_data.SIEMENS_ANATOMICAL)[0])
def test_is_classic_4d(self):
assert not convert_siemens._is_classic_4d(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_DTI)))
assert not convert_siemens._is_classic_4d(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_FMRI)))
assert convert_siemens._is_classic_4d(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_CLASSIC_DTI)))
assert convert_siemens._is_classic_4d(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_CLASSIC_FMRI)))
assert not convert_siemens._is_classic_4d(
convert_siemens._classic_get_grouped_dicoms(read_dicom_directory(test_data.SIEMENS_ANATOMICAL)))
def test_is_siemens(self):
assert not convert_siemens.is_siemens(read_dicom_directory(test_data.PHILIPS_ANATOMICAL))
assert convert_siemens.is_siemens(read_dicom_directory(test_data.SIEMENS_ANATOMICAL))
assert not convert_siemens.is_siemens(read_dicom_directory(test_data.GE_ANATOMICAL))
assert not convert_siemens.is_siemens(read_dicom_directory(test_data.GENERIC_ANATOMICAL))
assert not convert_siemens.is_siemens(read_dicom_directory(test_data.HITACHI_ANATOMICAL))
def test_get_asconv_headers(self):
mosaic = compressed_dicom.read_file(os.path.join(test_data.SIEMENS_FMRI, 'IM-0001-0001.dcm'))
asconv_headers = convert_siemens._get_asconv_headers(mosaic)
assert len(asconv_headers) == 64022
if __name__ == '__main__':
unittest.main()
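Every conversion test above repeats the same mkdtemp/try/finally lifecycle around the assertions. That pattern can be isolated into a small helper (a sketch, not part of the test suite):

```python
import os
import shutil
import tempfile

def with_temp_output(work):
    """Run work(tmp_dir) and always remove the directory afterwards,
    mirroring the mkdtemp/try/finally pattern used by the tests above."""
    tmp_output_dir = tempfile.mkdtemp()
    try:
        return work(tmp_output_dir)
    finally:
        shutil.rmtree(tmp_output_dir)

# The directory exists inside the callback and is gone afterwards.
path_holder = []
with_temp_output(lambda d: path_holder.append(d))
```

The `finally` clause guarantees cleanup even when an assertion inside `work` raises, which is exactly why the tests wrap every conversion in the same construct.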
# File: uptrends/api/alert_definition_api.py | Repo: hpcc-systems/uptrends-python @ 2e05ba851a4e65bde3c40514f499c475465bef90 | License: BSD-3-Clause
# coding: utf-8
"""
Uptrends API v4
This document describes Uptrends API version 4. This Swagger environment also lets you execute API methods directly. Please note that this is not a sandbox environment: these API methods operate directly on your actual Uptrends account. For more information, please visit https://www.uptrends.com/api. # noqa: E501
OpenAPI spec version: 1.0.0
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re # noqa: F401
# python 2 and python 3 compatibility library
import six
from uptrends.api_client import ApiClient
class AlertDefinitionApi(object):
"""NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
if api_client is None:
api_client = ApiClient()
self.api_client = api_client
def alert_definition_add_monitor_group_to_alert_definition(self, alert_definition_guid, monitor_group_guid, **kwargs): # noqa: E501
"""Adds a monitor group to the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_monitor_group_to_alert_definition(alert_definition_guid, monitor_group_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to modify. (required)
:param str monitor_group_guid: The Guid of the monitor group to add. (required)
:return: AlertDefinitionMonitorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_add_monitor_group_to_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_add_monitor_group_to_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, **kwargs) # noqa: E501
return data
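Every public method in this generated client dispatches the same way: run synchronously by default, or return a thread-like handle exposing `.get()` when `async_req=True` is passed. A self-contained sketch of that dispatch pattern (class and method names hypothetical, not part of the uptrends client):

```python
import threading

class _Result:
    """Tiny stand-in for the thread-like handle the generated client returns."""
    def __init__(self, target):
        self._value = None
        self._thread = threading.Thread(target=self._run, args=(target,))
        self._thread.start()

    def _run(self, target):
        self._value = target()

    def get(self):
        self._thread.join()          # block until the worker finishes
        return self._value

class DemoApi:
    def add_member(self, guid, **kwargs):
        # Same branch as the generated methods: async returns a handle,
        # sync unwraps the data immediately.
        if kwargs.get('async_req'):
            return _Result(lambda: self._add_member_impl(guid))
        return self._add_member_impl(guid)

    def _add_member_impl(self, guid):
        return {'guid': guid, 'status': 'added'}

api = DemoApi()
sync_result = api.add_member('abc')
async_result = api.add_member('abc', async_req=True).get()
```

Both paths funnel into the same `_with_http_info`-style implementation, so the sync and async results are identical; only the delivery mechanism differs.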
def alert_definition_add_monitor_group_to_alert_definition_with_http_info(self, alert_definition_guid, monitor_group_guid, **kwargs): # noqa: E501
"""Adds a monitor group to the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_monitor_group_to_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to modify. (required)
:param str monitor_group_guid: The Guid of the monitor group to add. (required)
:return: AlertDefinitionMonitorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'monitor_group_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_add_monitor_group_to_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_add_monitor_group_to_alert_definition`") # noqa: E501
# verify the required parameter 'monitor_group_guid' is set
if ('monitor_group_guid' not in params or
params['monitor_group_guid'] is None):
raise ValueError("Missing the required parameter `monitor_group_guid` when calling `alert_definition_add_monitor_group_to_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'monitor_group_guid' in params:
path_params['monitorGroupGuid'] = params['monitor_group_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/Members/MonitorGroup/{monitorGroupGuid}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinitionMonitorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
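The `_with_http_info` variants all repeat the same two checks before building the request: reject any unknown keyword argument, then require every mandatory parameter to be present and non-None. The core of that check, extracted into a standalone sketch (function name hypothetical):

```python
def validate_params(required, allowed_extras, supplied):
    """Mimic the generated parameter check: reject unknown kwargs,
    then require every mandatory parameter to be present and non-None."""
    all_params = list(required) + list(allowed_extras)
    for key in supplied:
        if key not in all_params:
            raise TypeError("Got an unexpected keyword argument '%s'" % key)
    for name in required:
        if supplied.get(name) is None:
            raise ValueError("Missing the required parameter `%s`" % name)

# Passes silently when every required name is supplied and non-None.
validate_params(['alert_definition_guid', 'monitor_group_guid'],
                ['async_req'],
                {'alert_definition_guid': 'a1', 'monitor_group_guid': 'm1'})
```

In the generated code the same logic is inlined per method via `locals()` and `params['kwargs']`; factoring it out as above shows there is nothing method-specific in it beyond the parameter names.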
def alert_definition_add_monitor_to_alert_definition(self, alert_definition_guid, monitor_guid, **kwargs): # noqa: E501
"""Adds a monitor to the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_monitor_to_alert_definition(alert_definition_guid, monitor_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to modify. (required)
:param str monitor_guid: The Guid of the monitor to add. (required)
:return: AlertDefinitionMonitor
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_add_monitor_to_alert_definition_with_http_info(alert_definition_guid, monitor_guid, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_add_monitor_to_alert_definition_with_http_info(alert_definition_guid, monitor_guid, **kwargs) # noqa: E501
return data
def alert_definition_add_monitor_to_alert_definition_with_http_info(self, alert_definition_guid, monitor_guid, **kwargs): # noqa: E501
"""Adds a monitor to the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_monitor_to_alert_definition_with_http_info(alert_definition_guid, monitor_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to modify. (required)
:param str monitor_guid: The Guid of the monitor to add. (required)
:return: AlertDefinitionMonitor
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'monitor_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_add_monitor_to_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_add_monitor_to_alert_definition`") # noqa: E501
# verify the required parameter 'monitor_guid' is set
if ('monitor_guid' not in params or
params['monitor_guid'] is None):
raise ValueError("Missing the required parameter `monitor_guid` when calling `alert_definition_add_monitor_to_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'monitor_guid' in params:
path_params['monitorGuid'] = params['monitor_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/Members/Monitor/{monitorGuid}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinitionMonitor', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_add_operator_group_to_escalation_level(self, alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs): # noqa: E501
"""Adds an operator group to the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_operator_group_to_escalation_level(alert_definition_guid, escalation_level_id, operator_group_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:param str operator_group_guid: The Guid of the operator group to add. (required)
:return: AlertDefinitionOperatorGroup
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_add_operator_group_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_add_operator_group_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs) # noqa: E501
return data
def alert_definition_add_operator_group_to_escalation_level_with_http_info(self, alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs): # noqa: E501
"""Adds an operator group to the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_operator_group_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:param str operator_group_guid: The Guid of the operator group to add. (required)
:return: AlertDefinitionOperatorGroup
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'escalation_level_id', 'operator_group_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_add_operator_group_to_escalation_level" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_add_operator_group_to_escalation_level`") # noqa: E501
# verify the required parameter 'escalation_level_id' is set
if ('escalation_level_id' not in params or
params['escalation_level_id'] is None):
raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_add_operator_group_to_escalation_level`") # noqa: E501
# verify the required parameter 'operator_group_guid' is set
if ('operator_group_guid' not in params or
params['operator_group_guid'] is None):
raise ValueError("Missing the required parameter `operator_group_guid` when calling `alert_definition_add_operator_group_to_escalation_level`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'escalation_level_id' in params:
path_params['escalationLevelId'] = params['escalation_level_id'] # noqa: E501
if 'operator_group_guid' in params:
path_params['operatorGroupGuid'] = params['operator_group_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Members/OperatorGroup/{operatorGroupGuid}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinitionOperatorGroup', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_add_operator_to_escalation_level(self, alert_definition_guid, escalation_level_id, operator_guid, **kwargs): # noqa: E501
"""Adds an operator to the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_operator_to_escalation_level(alert_definition_guid, escalation_level_id, operator_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:param str operator_guid: The Guid of the operator to add. (required)
:return: AlertDefinitionOperator
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_add_operator_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_add_operator_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, **kwargs) # noqa: E501
return data
def alert_definition_add_operator_to_escalation_level_with_http_info(self, alert_definition_guid, escalation_level_id, operator_guid, **kwargs): # noqa: E501
"""Adds an operator to the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_add_operator_to_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:param str operator_guid: The Guid of the operator to add. (required)
:return: AlertDefinitionOperator
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'escalation_level_id', 'operator_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_add_operator_to_escalation_level" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_add_operator_to_escalation_level`") # noqa: E501
# verify the required parameter 'escalation_level_id' is set
if ('escalation_level_id' not in params or
params['escalation_level_id'] is None):
raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_add_operator_to_escalation_level`") # noqa: E501
# verify the required parameter 'operator_guid' is set
if ('operator_guid' not in params or
params['operator_guid'] is None):
raise ValueError("Missing the required parameter `operator_guid` when calling `alert_definition_add_operator_to_escalation_level`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'escalation_level_id' in params:
path_params['escalationLevelId'] = params['escalation_level_id'] # noqa: E501
if 'operator_guid' in params:
path_params['operatorGuid'] = params['operator_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Members/Operator/{operatorGuid}', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinitionOperator', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_create_alert_definition(self, alert_definition, **kwargs): # noqa: E501
"""Creates a new alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_create_alert_definition(alert_definition, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The details of the alert definition to create. (required)
:return: AlertDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_create_alert_definition_with_http_info(alert_definition, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_create_alert_definition_with_http_info(alert_definition, **kwargs) # noqa: E501
return data
def alert_definition_create_alert_definition_with_http_info(self, alert_definition, **kwargs): # noqa: E501
"""Creates a new alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_create_alert_definition_with_http_info(alert_definition, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The details of the alert definition to create. (required)
:return: AlertDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_create_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition' is set
if ('alert_definition' not in params or
params['alert_definition'] is None):
raise ValueError("Missing the required parameter `alert_definition` when calling `alert_definition_create_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'alert_definition' in params:
body_params = params['alert_definition']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition', 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinition', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_delete_alert_definition(self, alert_definition_guid, **kwargs): # noqa: E501
"""Deletes an existing alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_delete_alert_definition(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to remove. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_delete_alert_definition_with_http_info(alert_definition_guid, **kwargs) # noqa: E501
else:
data = self.alert_definition_delete_alert_definition_with_http_info(alert_definition_guid, **kwargs)  # noqa: E501
return data
def alert_definition_delete_alert_definition_with_http_info(self, alert_definition_guid, **kwargs): # noqa: E501
"""Deletes an existing alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_delete_alert_definition_with_http_info(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition to remove. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_delete_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_delete_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}', 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_all_alert_definitions(self, **kwargs): # noqa: E501
"""Gets a list of all alert definitions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_all_alert_definitions(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[AlertDefinition]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_all_alert_definitions_with_http_info(**kwargs) # noqa: E501
else:
data = self.alert_definition_get_all_alert_definitions_with_http_info(**kwargs)  # noqa: E501
return data
def alert_definition_get_all_alert_definitions_with_http_info(self, **kwargs): # noqa: E501
"""Gets a list of all alert definitions. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_all_alert_definitions_with_http_info(async_req=True)
>>> result = thread.get()
:param async_req bool
:return: list[AlertDefinition]
If the method is called asynchronously,
returns the request thread.
"""
all_params = [] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_all_alert_definitions" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
path_params = {}
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AlertDefinition]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_all_members(self, alert_definition_guid, **kwargs): # noqa: E501
"""Gets a list of all monitor and monitor group guids of the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_all_members(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition for which to return the members. (required)
:return: list[AlertDefinitionMember]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_all_members_with_http_info(alert_definition_guid, **kwargs) # noqa: E501
else:
data = self.alert_definition_get_all_members_with_http_info(alert_definition_guid, **kwargs)  # noqa: E501
return data
def alert_definition_get_all_members_with_http_info(self, alert_definition_guid, **kwargs): # noqa: E501
"""Gets a list of all monitor and monitor group guids of the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_all_members_with_http_info(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition for which to return the members. (required)
:return: list[AlertDefinitionMember]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_all_members" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_get_all_members`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/Members', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AlertDefinitionMember]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_escalation_level(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the escalation level information of the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: EscalationLevel
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, **kwargs) # noqa: E501
else:
data = self.alert_definition_get_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, **kwargs)  # noqa: E501
return data
def alert_definition_get_escalation_level_with_http_info(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the escalation level information of the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: EscalationLevel
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'escalation_level_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_escalation_level" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_get_escalation_level`") # noqa: E501
# verify the required parameter 'escalation_level_id' is set
if ('escalation_level_id' not in params or
params['escalation_level_id'] is None):
raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_get_escalation_level`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'escalation_level_id' in params:
path_params['escalationLevelId'] = params['escalation_level_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='EscalationLevel', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_escalation_level_integration(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the integrations for the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level_integration(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: list[Integration]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_escalation_level_integration_with_http_info(alert_definition_guid, escalation_level_id, **kwargs) # noqa: E501
else:
data = self.alert_definition_get_escalation_level_integration_with_http_info(alert_definition_guid, escalation_level_id, **kwargs)  # noqa: E501
return data
def alert_definition_get_escalation_level_integration_with_http_info(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the integrations for the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level_integration_with_http_info(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: list[Integration]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'escalation_level_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_escalation_level_integration" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_get_escalation_level_integration`") # noqa: E501
# verify the required parameter 'escalation_level_id' is set
if ('escalation_level_id' not in params or
params['escalation_level_id'] is None):
raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_get_escalation_level_integration`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'escalation_level_id' in params:
path_params['escalationLevelId'] = params['escalation_level_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Integration', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[Integration]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_escalation_level_operator(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the operator and operator group guids for the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level_operator(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: list[AlertEscalationLevelMember]
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_escalation_level_operator_with_http_info(alert_definition_guid, escalation_level_id, **kwargs) # noqa: E501
else:
data = self.alert_definition_get_escalation_level_operator_with_http_info(alert_definition_guid, escalation_level_id, **kwargs)  # noqa: E501
return data
def alert_definition_get_escalation_level_operator_with_http_info(self, alert_definition_guid, escalation_level_id, **kwargs): # noqa: E501
"""Gets the operator and operator group guids for the specified escalation level. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_escalation_level_operator_with_http_info(alert_definition_guid, escalation_level_id, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:param int escalation_level_id: The escalation level id. (required)
:return: list[AlertEscalationLevelMember]
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid', 'escalation_level_id'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_escalation_level_operator" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_get_escalation_level_operator`") # noqa: E501
# verify the required parameter 'escalation_level_id' is set
if ('escalation_level_id' not in params or
params['escalation_level_id'] is None):
raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_get_escalation_level_operator`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
if 'escalation_level_id' in params:
path_params['escalationLevelId'] = params['escalation_level_id'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Members', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='list[AlertEscalationLevelMember]', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_get_specified_alert_definitions(self, alert_definition_guid, **kwargs): # noqa: E501
"""Gets the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_specified_alert_definitions(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:return: AlertDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_get_specified_alert_definitions_with_http_info(alert_definition_guid, **kwargs) # noqa: E501
else:
data = self.alert_definition_get_specified_alert_definitions_with_http_info(alert_definition_guid, **kwargs)  # noqa: E501
return data
def alert_definition_get_specified_alert_definitions_with_http_info(self, alert_definition_guid, **kwargs): # noqa: E501
"""Gets the specified alert definition. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_get_specified_alert_definitions_with_http_info(alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param str alert_definition_guid: The Guid of the alert definition. (required)
:return: AlertDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_get_specified_alert_definitions" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_get_specified_alert_definitions`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='AlertDefinition', # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_patch_alert_definition(self, alert_definition, alert_definition_guid, **kwargs): # noqa: E501
"""Partially updates the definition of the specified alert definition. # noqa: E501
This method accepts parts of an alert definition. Fields that do not require changes can be omitted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_patch_alert_definition(alert_definition, alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The partial definition for the alert definition that should be updated. (required)
:param str alert_definition_guid: The Guid of the alert definition that should be updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_patch_alert_definition_with_http_info(alert_definition, alert_definition_guid, **kwargs) # noqa: E501
else:
data = self.alert_definition_patch_alert_definition_with_http_info(alert_definition, alert_definition_guid, **kwargs)  # noqa: E501
return data
def alert_definition_patch_alert_definition_with_http_info(self, alert_definition, alert_definition_guid, **kwargs): # noqa: E501
"""Partially updates the definition of the specified alert definition. # noqa: E501
This method accepts parts of an alert definition. Fields that do not require changes can be omitted. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_patch_alert_definition_with_http_info(alert_definition, alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The partial definition for the alert definition that should be updated. (required)
:param str alert_definition_guid: The Guid of the alert definition that should be updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition', 'alert_definition_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_patch_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition' is set
if ('alert_definition' not in params or
params['alert_definition'] is None):
raise ValueError("Missing the required parameter `alert_definition` when calling `alert_definition_patch_alert_definition`") # noqa: E501
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_patch_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'alert_definition' in params:
body_params = params['alert_definition']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def alert_definition_put_alert_definition(self, alert_definition, alert_definition_guid, **kwargs): # noqa: E501
"""Updates the definition of the specified alert definition. # noqa: E501
This method only accepts a complete alert definition in which all fields are specified. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_put_alert_definition(alert_definition, alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The partial definition for the alert definition that should be updated. (required)
:param str alert_definition_guid: The Guid of the alert definition that should be updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('async_req'):
return self.alert_definition_put_alert_definition_with_http_info(alert_definition, alert_definition_guid, **kwargs) # noqa: E501
else:
(data) = self.alert_definition_put_alert_definition_with_http_info(alert_definition, alert_definition_guid, **kwargs) # noqa: E501
return data
def alert_definition_put_alert_definition_with_http_info(self, alert_definition, alert_definition_guid, **kwargs): # noqa: E501
"""Updates the definition of the specified alert definition. # noqa: E501
This methods only accepts a complete alert definition where all fields are specified. # noqa: E501
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.alert_definition_put_alert_definition_with_http_info(alert_definition, alert_definition_guid, async_req=True)
>>> result = thread.get()
:param async_req bool
:param AlertDefinition alert_definition: The partial definition for the alert definition that should be updated. (required)
:param str alert_definition_guid: The Guid of the alert definition that should be updated. (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['alert_definition', 'alert_definition_guid'] # noqa: E501
all_params.append('async_req')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in six.iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method alert_definition_put_alert_definition" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'alert_definition' is set
if ('alert_definition' not in params or
params['alert_definition'] is None):
raise ValueError("Missing the required parameter `alert_definition` when calling `alert_definition_put_alert_definition`") # noqa: E501
# verify the required parameter 'alert_definition_guid' is set
if ('alert_definition_guid' not in params or
params['alert_definition_guid'] is None):
raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_put_alert_definition`") # noqa: E501
collection_formats = {}
path_params = {}
if 'alert_definition_guid' in params:
path_params['alertDefinitionGuid'] = params['alert_definition_guid'] # noqa: E501
query_params = []
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'alert_definition' in params:
body_params = params['alert_definition']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json', 'application/xml']) # noqa: E501
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type( # noqa: E501
['application/json', 'application/xml']) # noqa: E501
# Authentication setting
auth_settings = ['basicauth'] # noqa: E501
return self.api_client.call_api(
'/AlertDefinition/{alertDefinitionGuid}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)

    def alert_definition_remove_monitor_from_alert_definition(self, alert_definition_guid, monitor_guid, **kwargs):  # noqa: E501
        """Removes a monitor from the specified alert definition.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_monitor_from_alert_definition(alert_definition_guid, monitor_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition to modify. (required)
        :param str monitor_guid: The Guid of the monitor to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_remove_monitor_from_alert_definition_with_http_info(alert_definition_guid, monitor_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_remove_monitor_from_alert_definition_with_http_info(alert_definition_guid, monitor_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_remove_monitor_from_alert_definition_with_http_info(self, alert_definition_guid, monitor_guid, **kwargs):  # noqa: E501
        """Removes a monitor from the specified alert definition.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_monitor_from_alert_definition_with_http_info(alert_definition_guid, monitor_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition to modify. (required)
        :param str monitor_guid: The Guid of the monitor to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['alert_definition_guid', 'monitor_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_remove_monitor_from_alert_definition" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_remove_monitor_from_alert_definition`")  # noqa: E501
        # verify the required parameter 'monitor_guid' is set
        if ('monitor_guid' not in params or
                params['monitor_guid'] is None):
            raise ValueError("Missing the required parameter `monitor_guid` when calling `alert_definition_remove_monitor_from_alert_definition`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'monitor_guid' in params:
            path_params['monitorGuid'] = params['monitor_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/Members/Monitor/{monitorGuid}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def alert_definition_remove_monitor_group_from_alert_definition(self, alert_definition_guid, monitor_group_guid, **kwargs):  # noqa: E501
        """Removes a monitor group from the specified alert definition.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_monitor_group_from_alert_definition(alert_definition_guid, monitor_group_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition to modify. (required)
        :param str monitor_group_guid: The Guid of the monitor group to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_remove_monitor_group_from_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_remove_monitor_group_from_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_remove_monitor_group_from_alert_definition_with_http_info(self, alert_definition_guid, monitor_group_guid, **kwargs):  # noqa: E501
        """Removes a monitor group from the specified alert definition.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_monitor_group_from_alert_definition_with_http_info(alert_definition_guid, monitor_group_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition to modify. (required)
        :param str monitor_group_guid: The Guid of the monitor group to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['alert_definition_guid', 'monitor_group_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_remove_monitor_group_from_alert_definition" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_remove_monitor_group_from_alert_definition`")  # noqa: E501
        # verify the required parameter 'monitor_group_guid' is set
        if ('monitor_group_guid' not in params or
                params['monitor_group_guid'] is None):
            raise ValueError("Missing the required parameter `monitor_group_guid` when calling `alert_definition_remove_monitor_group_from_alert_definition`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'monitor_group_guid' in params:
            path_params['monitorGroupGuid'] = params['monitor_group_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/Members/MonitorGroup/{monitorGroupGuid}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def alert_definition_remove_operator_from_escalation_level(self, alert_definition_guid, escalation_level_id, operator_guid, **kwargs):  # noqa: E501
        """Removes an operator from the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_operator_from_escalation_level(alert_definition_guid, escalation_level_id, operator_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str operator_guid: The Guid of the operator to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_remove_operator_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_remove_operator_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_remove_operator_from_escalation_level_with_http_info(self, alert_definition_guid, escalation_level_id, operator_guid, **kwargs):  # noqa: E501
        """Removes an operator from the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_operator_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str operator_guid: The Guid of the operator to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['alert_definition_guid', 'escalation_level_id', 'operator_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_remove_operator_from_escalation_level" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_remove_operator_from_escalation_level`")  # noqa: E501
        # verify the required parameter 'escalation_level_id' is set
        if ('escalation_level_id' not in params or
                params['escalation_level_id'] is None):
            raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_remove_operator_from_escalation_level`")  # noqa: E501
        # verify the required parameter 'operator_guid' is set
        if ('operator_guid' not in params or
                params['operator_guid'] is None):
            raise ValueError("Missing the required parameter `operator_guid` when calling `alert_definition_remove_operator_from_escalation_level`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'escalation_level_id' in params:
            path_params['escalationLevelId'] = params['escalation_level_id']  # noqa: E501
        if 'operator_guid' in params:
            path_params['operatorGuid'] = params['operator_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Members/Operator/{operatorGuid}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def alert_definition_remove_operator_group_from_escalation_level(self, alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs):  # noqa: E501
        """Removes an operator group from the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_operator_group_from_escalation_level(alert_definition_guid, escalation_level_id, operator_group_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str operator_group_guid: The Guid of the operator group to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_remove_operator_group_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_remove_operator_group_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_remove_operator_group_from_escalation_level_with_http_info(self, alert_definition_guid, escalation_level_id, operator_group_guid, **kwargs):  # noqa: E501
        """Removes an operator group from the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_remove_operator_group_from_escalation_level_with_http_info(alert_definition_guid, escalation_level_id, operator_group_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str operator_group_guid: The Guid of the operator group to remove. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['alert_definition_guid', 'escalation_level_id', 'operator_group_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_remove_operator_group_from_escalation_level" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_remove_operator_group_from_escalation_level`")  # noqa: E501
        # verify the required parameter 'escalation_level_id' is set
        if ('escalation_level_id' not in params or
                params['escalation_level_id'] is None):
            raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_remove_operator_group_from_escalation_level`")  # noqa: E501
        # verify the required parameter 'operator_group_guid' is set
        if ('operator_group_guid' not in params or
                params['operator_group_guid'] is None):
            raise ValueError("Missing the required parameter `operator_group_guid` when calling `alert_definition_remove_operator_group_from_escalation_level`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'escalation_level_id' in params:
            path_params['escalationLevelId'] = params['escalation_level_id']  # noqa: E501
        if 'operator_group_guid' in params:
            path_params['operatorGroupGuid'] = params['operator_group_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Members/OperatorGroup/{operatorGroupGuid}', 'DELETE',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def alert_definition_update_integration_for_escalation_with_patch(self, escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs):  # noqa: E501
        """Partially updates an integration for the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_update_integration_for_escalation_with_patch(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EscalationLevelIntegration escalation_level_integration: The partial definition for the integration that should be updated. (required)
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str integration_guid: The Guid of the integration to update. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_update_integration_for_escalation_with_patch_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_update_integration_for_escalation_with_patch_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_update_integration_for_escalation_with_patch_with_http_info(self, escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs):  # noqa: E501
        """Partially updates an integration for the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_update_integration_for_escalation_with_patch_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EscalationLevelIntegration escalation_level_integration: The partial definition for the integration that should be updated. (required)
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str integration_guid: The Guid of the integration to update. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['escalation_level_integration', 'alert_definition_guid', 'escalation_level_id', 'integration_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_update_integration_for_escalation_with_patch" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'escalation_level_integration' is set
        if ('escalation_level_integration' not in params or
                params['escalation_level_integration'] is None):
            raise ValueError("Missing the required parameter `escalation_level_integration` when calling `alert_definition_update_integration_for_escalation_with_patch`")  # noqa: E501
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_update_integration_for_escalation_with_patch`")  # noqa: E501
        # verify the required parameter 'escalation_level_id' is set
        if ('escalation_level_id' not in params or
                params['escalation_level_id'] is None):
            raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_update_integration_for_escalation_with_patch`")  # noqa: E501
        # verify the required parameter 'integration_guid' is set
        if ('integration_guid' not in params or
                params['integration_guid'] is None):
            raise ValueError("Missing the required parameter `integration_guid` when calling `alert_definition_update_integration_for_escalation_with_patch`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'escalation_level_id' in params:
            path_params['escalationLevelId'] = params['escalation_level_id']  # noqa: E501
        if 'integration_guid' in params:
            path_params['integrationGuid'] = params['integration_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'escalation_level_integration' in params:
            body_params = params['escalation_level_integration']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Integration/{integrationGuid}', 'PATCH',
            path_params,
            query_params,
            header_params,
            body=body_params,
            post_params=form_params,
            files=local_var_files,
            response_type=None,  # noqa: E501
            auth_settings=auth_settings,
            async_req=params.get('async_req'),
            _return_http_data_only=params.get('_return_http_data_only'),
            _preload_content=params.get('_preload_content', True),
            _request_timeout=params.get('_request_timeout'),
            collection_formats=collection_formats)

    def alert_definition_update_integration_for_escalation_with_put(self, escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs):  # noqa: E501
        """Updates an integration for the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_update_integration_for_escalation_with_put(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EscalationLevelIntegration escalation_level_integration: The definition for the integration that should be updated. (required)
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str integration_guid: The Guid of the integration to update. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        kwargs['_return_http_data_only'] = True
        if kwargs.get('async_req'):
            return self.alert_definition_update_integration_for_escalation_with_put_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs)  # noqa: E501
        else:
            (data) = self.alert_definition_update_integration_for_escalation_with_put_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs)  # noqa: E501
            return data

    def alert_definition_update_integration_for_escalation_with_put_with_http_info(self, escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, **kwargs):  # noqa: E501
        """Updates an integration for the specified escalation level.  # noqa: E501

        This method makes a synchronous HTTP request by default. To make an
        asynchronous HTTP request, please pass async_req=True
        >>> thread = api.alert_definition_update_integration_for_escalation_with_put_with_http_info(escalation_level_integration, alert_definition_guid, escalation_level_id, integration_guid, async_req=True)
        >>> result = thread.get()

        :param async_req bool
        :param EscalationLevelIntegration escalation_level_integration: The definition for the integration that should be updated. (required)
        :param str alert_definition_guid: The Guid of the alert definition. (required)
        :param int escalation_level_id: The escalation level id. (required)
        :param str integration_guid: The Guid of the integration to update. (required)
        :return: None
                 If the method is called asynchronously,
                 returns the request thread.
        """
        all_params = ['escalation_level_integration', 'alert_definition_guid', 'escalation_level_id', 'integration_guid']  # noqa: E501
        all_params.append('async_req')
        all_params.append('_return_http_data_only')
        all_params.append('_preload_content')
        all_params.append('_request_timeout')

        params = locals()
        for key, val in six.iteritems(params['kwargs']):
            if key not in all_params:
                raise TypeError(
                    "Got an unexpected keyword argument '%s'"
                    " to method alert_definition_update_integration_for_escalation_with_put" % key
                )
            params[key] = val
        del params['kwargs']
        # verify the required parameter 'escalation_level_integration' is set
        if ('escalation_level_integration' not in params or
                params['escalation_level_integration'] is None):
            raise ValueError("Missing the required parameter `escalation_level_integration` when calling `alert_definition_update_integration_for_escalation_with_put`")  # noqa: E501
        # verify the required parameter 'alert_definition_guid' is set
        if ('alert_definition_guid' not in params or
                params['alert_definition_guid'] is None):
            raise ValueError("Missing the required parameter `alert_definition_guid` when calling `alert_definition_update_integration_for_escalation_with_put`")  # noqa: E501
        # verify the required parameter 'escalation_level_id' is set
        if ('escalation_level_id' not in params or
                params['escalation_level_id'] is None):
            raise ValueError("Missing the required parameter `escalation_level_id` when calling `alert_definition_update_integration_for_escalation_with_put`")  # noqa: E501
        # verify the required parameter 'integration_guid' is set
        if ('integration_guid' not in params or
                params['integration_guid'] is None):
            raise ValueError("Missing the required parameter `integration_guid` when calling `alert_definition_update_integration_for_escalation_with_put`")  # noqa: E501

        collection_formats = {}

        path_params = {}
        if 'alert_definition_guid' in params:
            path_params['alertDefinitionGuid'] = params['alert_definition_guid']  # noqa: E501
        if 'escalation_level_id' in params:
            path_params['escalationLevelId'] = params['escalation_level_id']  # noqa: E501
        if 'integration_guid' in params:
            path_params['integrationGuid'] = params['integration_guid']  # noqa: E501

        query_params = []

        header_params = {}

        form_params = []
        local_var_files = {}

        body_params = None
        if 'escalation_level_integration' in params:
            body_params = params['escalation_level_integration']
        # HTTP header `Accept`
        header_params['Accept'] = self.api_client.select_header_accept(
            ['application/json', 'application/xml'])  # noqa: E501

        # HTTP header `Content-Type`
        header_params['Content-Type'] = self.api_client.select_header_content_type(  # noqa: E501
            ['application/json', 'application/xml'])  # noqa: E501

        # Authentication setting
        auth_settings = ['basicauth']  # noqa: E501

        return self.api_client.call_api(
            '/AlertDefinition/{alertDefinitionGuid}/EscalationLevel/{escalationLevelId}/Integration/{integrationGuid}', 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None, # noqa: E501
auth_settings=auth_settings,
async_req=params.get('async_req'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
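The keyword-validation idiom that each generated method repeats (collect `locals()`, walk `kwargs`, reject anything outside `all_params`) can be sketched in isolation. The helper name `check_kwargs` is hypothetical, not part of the generated client:

```python
def check_kwargs(kwargs, all_params, method_name):
    """Reject unexpected keyword arguments, as the generated methods do."""
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )

# Accepted: every key is declared in all_params.
check_kwargs({'async_req': True}, ['async_req'], 'demo')

# Rejected: unknown keys raise TypeError, mirroring the generated check.
try:
    check_kwargs({'bogus': 1}, ['async_req'], 'demo')
except TypeError as exc:
    print(exc)
```

The generated code inlines this per method; factoring it out is only for illustration.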
| 49.645012 | 321 | 0.669458 | 12,348 | 106,985 | 5.478053 | 0.018869 | 0.140813 | 0.075839 | 0.021288 | 0.987981 | 0.987257 | 0.98563 | 0.98362 | 0.981343 | 0.97781 | 0 | 0.012272 | 0.251297 | 106,985 | 2,154 | 322 | 49.668059 | 0.83221 | 0.324158 | 0 | 0.831526 | 0 | 0 | 0.2726 | 0.131414 | 0 | 0 | 0 | 0 | 0 | 1 | 0.034195 | false | 0 | 0.003336 | 0 | 0.088407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
c4ff62e06552a2fb5ed82e376b924f37bb949faa | 7,187 | py | Python | TP1/src/alineaC.py | LuisPereira23/PL-2021 | 951190835d8989e3afda1fd0f8f9ef08f5d85e07 | [
"MIT"
] | null | null | null | TP1/src/alineaC.py | LuisPereira23/PL-2021 | 951190835d8989e3afda1fd0f8f9ef08f5d85e07 | [
"MIT"
] | null | null | null | TP1/src/alineaC.py | LuisPereira23/PL-2021 | 951190835d8989e3afda1fd0f8f9ef08f5d85e07 | [
"MIT"
] | null | null | null | import re
def bibtex2json(docBib):
docBib = "[" + docBib
docBib = re.sub(
r',\n',
r',',
docBib
)
docBib = re.sub(
r'["}],',
r',\n',
docBib
)
docBib = re.sub(
r'@',
r'\n@',
docBib
)
docBib = re.sub(
r'%(.)*',
r'',
docBib
)
docBib = re.sub(
r'>.*\n+',
r'',
docBib
)
docBib = re.sub(
r'@([a-zA-Z]+){([À-ÿa-zA-Z0-9 :,\\{}/\.\-]+),',
r'{\n\t"categoria":"\1",\n\t"label":"\2",',
docBib
)
docBib = re.sub(
r'(?i) +author *= *{([À-ÿa-zA-Z0-9 ,\\{}:/\.\'\~]+)\n? *([À-ÿa-zA-Z0-9 ,\\{}:/\.]*)}',
r'\t"author":"\1 \2"',
docBib
)
docBib = re.sub(
r'(?i) +author *= *\"([À-ÿa-zA-Z0-9 ,\\{}:/\.]+)\n? *([À-ÿa-zA-Z0-9 ,\\{}:/\.]*)\"',
r'\t"author":"\1 \2"',
docBib
)
docBib = re.sub(
r'(?i) +title *= *{([À-ÿa-zA-Z0-9 ,\{\}\+\=\!:\/\.\?\_\$\-\&\'\(\)\{\}\#\\]+)\n?( *[À-ÿa-zA-Z0-9 ,\{\}:\/\.\-*]*)\n?( *[À-ÿa-zA-Z0-9 ,\{\}:\/\.]*)}',
r'\t"title":"\1\2\3"',
docBib
)
docBib = re.sub(
r'(?i) +title *= *{([^"]+)\n?("(.*)")( *[À-ÿa-zA-Z0-9 ,\\{}:\/\.\-*]*)\n?( *[À-ÿa-zA-Z0-9 ,\\{}:\/\.]*)}',
r'\t"title":"\1\3\4"',
docBib
)
docBib = re.sub(
r' +title *= *\"([À-ÿa-zA-Z0-9 \=\,\{\}\!:\/\.\?\\]+)\n?( *[À-ÿa-zA-Z0-9 \,\{\}:\/\.\-\*]*)\n?( *[À-ÿa-zA-Z0-9 \,\{\}\:\/\.]*)\"',
r'\t"title":"\1\2\3"',
docBib
)
docBib = re.sub(
r' +note *= *\"([À-ÿa-zA-Z0-9 ,\\{}:/\.]+)\n? *([À-ÿa-zA-Z0-9 ,\\{}:/\.]*)\"',
r'\t"note":"\1\2"',
docBib
)
docBib = re.sub(
r' +note *= *{([^}]*) *}',
r'\t"note":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +booktitle *= *{([\$\º\'\ª\-À-ÿa-zA-Z0-9 ,\\{}:/\(\)\.]+)\n? *([À-ÿa-zA-Z0-9 ,\\{}:/\(\)\.]*)}',
r'\t"booktitle":"\1 \2"',
docBib
)
docBib = re.sub(
r'(?i) +booktitle *= *\"([\'\-À-ÿa-zA-Z0-9 ,\\{}:/\(\)\.]+)\n? *([À-ÿa-zA-Z0-9 ,\\{}:/\(\)\.]*)\"',
r'\t"booktitle":"\1 \2"',
docBib
)
docBib = re.sub(
r' +address *= *\"([^\"]+)\"',
r'\t"address":"\1"',
docBib
)
docBib = re.sub(
r' +address *= *{(.*) *}',
r'\t"address":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +year *= *([0-9]+)',
r'\t"year":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +year *= *\"([0-9]+)\"',
r'\t"year":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +year *= *{([0-9]*)}',
r'\t"year":"\1"',
docBib
)
docBib = re.sub(
r' +institution *= *\"(.*) *\"',
r'\t"institution":"\1"',
docBib
)
docBib = re.sub(
r' +type *= *\"(.*) *\"',
r'\t"type":"\1"',
docBib
)
docBib = re.sub(
r' +keyword *= *\"(.*) *\"',
r'\t"keyword":"\1"',
docBib
)
docBib = re.sub(
r' +keyword *= *{(.*) *}',
r'\t"keyword":"\1"',
docBib
)
docBib = re.sub(
r' +editor *=\t* *{(.*) *}',
r'\t"editor":"\1"',
docBib
)
docBib = re.sub(
r' +url *= *\"(.*) *\"',
r'\t"url":"\1"',
docBib
)
docBib = re.sub(
r' *url *=\n? *{(.*) *}',
r'\t"url":"\1"',
docBib
)
docBib = re.sub(
r' +month *= *\"(.*) *\"',
r'\t"month":"\1"',
docBib
)
docBib = re.sub(
r' +month *= *{(.*) *}',
r'\t"month":"\1"',
docBib
)
#
# docBib = re.sub(
# r' +abstract *= *{([0-9]*)}',
# r'\t"abstract":"\1"',
# docBib
# )
#
# docBib = re.sub(
# r' +abstract *= *\"([0-9]*)\"',
# r'\t"abstract":"\1"',
# docBib
# )
docBib = re.sub(
r' +editor *= *\"(.+) *\"',
r'\t"editor":"\1"',
docBib
)
docBib = re.sub(
r' +pages *= *\"(.*) *\"',
r'\t"pages":"\1"',
docBib
)
docBib = re.sub(
r' +pages *=\t* *{(.*) *}',
r'\t"pages":"\1"',
docBib
)
docBib = re.sub(
r' +number *= *[\"{](.*) *[\"}]',
r'\t"number":"\1"',
docBib
)
docBib = re.sub(
r' +number *= *\"?([0-9]*)\"?',
r'\t"number":"\1"',
docBib
)
docBib = re.sub(
r' +note *= *\"(.+) *\"',
r'\t"note":"\1"',
docBib
)
docBib = re.sub(
r' +publisher *= *\"(.*) *\"',
r'\t"publisher":"\1"',
docBib
)
docBib = re.sub(
r' +publisher *= *{(.*) *}',
r'\t"publisher":"\1"',
docBib
)
docBib = re.sub(
r' +docpage *= *\"(.+) *\"',
r'\t"docpage":"\1"',
docBib
)
docBib = re.sub(
r' +series *= *\"(.+) *\"',
r'\t"series":"\1"',
docBib
)
docBib = re.sub(
r' +series *=\t* *{(.+) *}',
r'\t"series":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +volume *= *\"([0-9]*)\"',
r'\t"volume":"\1"',
docBib
)
docBib = re.sub(
r'(?i) +volume *=\t* *{?([0-9A-Z\(\) ]*)}?',
r'\t"volume":"\1"',
docBib
)
docBib = re.sub(
r' +journal *= *{(.+) *}',
r'\t"journal":"\1"',
docBib
)
docBib = re.sub(
r' +journal *= *\"(.+) *\"',
r'\t"journal":"\1"',
docBib
)
docBib = re.sub(
r' +isbn *= *\"(.*) *\"',
r'\t"isbn":"\1"',
docBib
)
docBib = re.sub(
r' +(isbn|ISBN)(13)? *= *{(.*) *}',
r'\t"\1\2":"\3"',
docBib
)
docBib = re.sub(
r' +lang *= *\"(.*) *\"',
r'\t"lang":"\1"',
docBib
)
docBib = re.sub(
r' +lang *= *{(.*) *}',
r'\t"lang":"\1"',
docBib
)
docBib = re.sub(
r' +school *= *{(.+) *}',
r'\t"school":"\1"',
docBib
)
docBib = re.sub(
r' +superviser *= *\"(.*) *\"',
r'\t"superviser":"\1"',
docBib
)
docBib = re.sub(
r' +location *= *\"(.*) *\"',
r'\t"location":"\1"',
docBib
)
docBib = re.sub(
r' +Note *= *\"(.*) *\"',
r'\t"note":"\1"',
docBib
)
docBib = re.sub(
r' +shortin *= *{(.*) *}',
r'\t"shortin":"\1"',
docBib
)
docBib = re.sub(
r' +edition *=\t* *{(.*) *}',
r'\t"edition":"\1"',
docBib
)
docBib = re.sub(
r' +annote *=\t* *{(.*) *}',
r'\t"annote":"\1"',
docBib
)
docBib = re.sub(
r'\\',
r'\\\\',
docBib
)
docBib = re.sub(
r'}\n',
r'},\n',
docBib
)
docBib = re.sub(
r',\n}',
r'\n}',
docBib
)
docBib = docBib + "]"
return docBib
with open("exemplo-utf8.bib", encoding="utf8") as f, open("out.json", "w", encoding="utf8") as out:
    conteudo = f.read()
    out.write(bibtex2json(conteudo))
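The conversion above is driven entirely by field-level substitutions; a single one can be exercised in isolation (the sample entry is invented for illustration):

```python
import re

# One re.sub turns a BibTeX field into a JSON-style "key":"value" pair,
# the same pattern bibtex2json applies field by field.
sample = '  year = "2021"'
converted = re.sub(r'(?i) +year *= *\"([0-9]+)\"', r'\t"year":"\1"', sample)
print(converted)  # \t"year":"2021"
```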
| 18.716146 | 157 | 0.3139 | 772 | 7,187 | 2.920984 | 0.091969 | 0.340577 | 0.372506 | 0.452328 | 0.837251 | 0.829268 | 0.780488 | 0.725055 | 0.701109 | 0.586253 | 0 | 0.028087 | 0.375817 | 7,187 | 384 | 158 | 18.716146 | 0.474588 | 0.028941 | 0 | 0.51505 | 0 | 0.020067 | 0.349943 | 0.023262 | 0 | 0 | 0 | 0 | 0 | 1 | 0.003344 | false | 0 | 0.003344 | 0 | 0.010033 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
48009df1ff99e9a6f744b8e81b4555fa1822c2b6 | 21,675 | py | Python | smaframework/common/address_keywords_extension_map.py | diegopso/smaframework | a49ccd1f035ab257acf734e07f88b4ed17d6cbc3 | [
"MIT"
] | 1 | 2020-12-25T07:10:27.000Z | 2020-12-25T07:10:27.000Z | smaframework/common/address_keywords_extension_map.py | diegopso/smaframework | a49ccd1f035ab257acf734e07f88b4ed17d6cbc3 | [
"MIT"
] | null | null | null | smaframework/common/address_keywords_extension_map.py | diegopso/smaframework | a49ccd1f035ab257acf734e07f88b4ed17d6cbc3 | [
"MIT"
] | null | null | null | import re
'''
* Parses an address string to collect the relevant keywords.
*
* @param address - The address string.
* @param mode - `extend` (to add abbreviations) or `clean` (to remove commom words).
'''
def parse_str(address, mode='clean'):
address_str = re.sub(r'[^a-zA-Z0-9\-]+', ' ', address).lower()
address_keywords = address_str.split()
if mode == 'extend':
extensions = list(map(lambda k: next((tp for tp in address_keywords_extensions if k in tp), []), address_keywords))
for e in extensions:
if len(e):
address_keywords.extend(e)
elif mode == 'clean':
address_keywords = [item for item in address_keywords if item not in address_stop_words]
return address_keywords
address_stop_words = ["alley","allee","aly","ally","anex","anx","annex","annx","arcade","arc","avenue","av","ave","aven","avenu","avn","avnue","bayou","bayoo","byu","beach","bch","bend","bnd","bluff","blf","bluf","bluffs","blfs","bottom","bot","btm","bottm","boulevard","blvd","boul","boulv","branch","br","brnch","bridge","brdge","brg","brook","brk","brooks","brks","burg","bg","burgs","bgs","bypass","byp","bypa","bypas","byps","camp","cp","cmp","canyon","canyn","cyn","cnyn","cape","cpe","causeway","cswy","causwa","center","cen","ctr","cent","centr","centre","cnter","cntr","centers","ctrs","circle","cir","circ","circl","crcl","crcle","circles","cirs","cliff","clf","cliffs","clfs","club","clb","common","cmn","commons","cmns","corner","cor","corners","cors","course","crse","court","ct","courts","cts","cove","cv","coves","cvs","creek","crk","crescent","cres","crsent","crsnt","crest","crst","crossing","xing","crssng","crossroad","xrd","crossroads","xrds","curve","curv","dale","dl","dam","dm","divide","div","dv","dvd","drive","dr","driv","drv","drives","drs","estate","est","estates","ests","expressway","exp","expy","expr","express","expw","extension","ext","extn","extnsn","extensions","exts","fall","falls","fls","ferry","fry","frry","field","fld","fields","flds","flat","flt","flats","flts","ford","frd","fords","frds","forest","frst","forests","forge","forg","frg","forges","frgs","fork","frk","forks","frks","fort","ft","frt","freeway","fwy","freewy","frway","frwy","garden","gdn","gardn","grden","grdn","gardens","gdns","grdns","gateway","gtwy","gatewy","gatway","gtway","glen","gln","glens","glns","green","grn","greens","grns","grove","grov","grv","groves","grvs","harbor","harb","hbr","harbr","hrbor","harbors","hbrs","haven","hvn","heights","ht","hts","highway","hwy","highwy","hiway","hiwy","hway","hill","hl","hills","hls","hollow","hllw","holw","hollows","holws","inlet","inlt","island","is","islnd","islands","iss","islnds","isle","isles","junction","jct","jction","jctn","junctn","juncton","junctions","jctns","jcts","key","ky","keys","kys","knoll","knl","knol","knolls","knls","lake","lk","lakes","lks","land","landing","lndg","lndng","lane","ln","light","lgt","lights","lgts","loaf","lf","lock","lck","locks","lcks","lodge","ldg","ldge","lodg","loop","loops","mall","manor","mnr","manors","mnrs","meadow","mdw","meadows","mdw","mdws","medows","mews","mill","ml","mills","mls","mission","missn","msn","mssn","motorway","mtwy","mount","mnt","mt","mountain","mntain","mtn","mntn","mountin","mtin","mountains","mntns","mtns","neck","nck","orchard","orch","orchrd","oval","ovl","overpass","opas","park","prk","parks","park","parkway","pkwy","parkwy","pkway","pky","parkways","pkwy","pkwys","pass","passage","psge","path","paths","pike","pikes","pine","pne","pines","pnes","place","pl","plain","pln","plains","plns","plaza","plz","plza","point","pt","points","pts","port","prt","ports","prts","prairie","pr","prr","radial","rad","radl","radiel","ramp","ranch","rnch","ranches","rnchs","rapid","rpd","rapids","rpds","rest","rst","ridge","rdg","rdge","ridges","rdgs","river","riv","rvr","rivr","road","rd","roads","rds","route","rte","row","rue","run","shoal","shl","shoals","shls","shore","shoar","shr","shores","shoars","shrs","skyway","skwy","spring","spg","spng","sprng","springs","spgs","spngs","sprngs","spur","spurs","spur","square","sq","sqr","sqre","squ","squares","sqrs","sqs","station","sta","statn","stn","stravenue","stra","strav","straven","stravn","strvn","strvnue","stream","strm","streme","street","st","strt","str","streets","sts","summit","smt","sumit","sumitt","terrace","ter","terr","throughway","trwy","trace","trce","traces","track","trak","tracks","trk","trks","trafficway","trfy","trail","trl","trails","trls","trailer","trlr","trlrs","tunnel","tunel","tunl","tunls","tunnels","tunnl","turnpike","trnpk","tpke","turnpk","underpass","upas","union","un","unions","uns","valley","vly","vally","vlly","valleys","vlys","viaduct","vdct","via","viadct","view","vw","views","vws","village","vill","vlg","villag","villg","villiage","villages","vlgs","ville","vl","vista","vis","vist","vst","vsta","walk","walks","walk","wall","way","wy","ways","well","wl","wells","wls"]
'''
* Map used to extend address keywords, extracted from USPS.com Postal Explorer: C1 Street Suffix Abbreviations.
*
* Each inner list groups a street-suffix word with its accepted abbreviations.
'''
address_keywords_extensions = [
[
"alley",
"allee",
"aly",
"ally"
],
[
"anex",
"anx",
"annex",
"annx"
],
[
"arcade",
"arc"
],
[
"avenue",
"av",
"ave",
"aven",
"avenu",
"avn",
"avnue"
],
[
"bayou",
"bayoo",
"byu"
],
[
"beach",
"bch"
],
[
"bend",
"bnd"
],
[
"bluff",
"blf",
"bluf"
],
[
"bluffs",
"blfs"
],
[
"bottom",
"bot",
"btm",
"bottm"
],
[
"boulevard",
"blvd",
"boul",
"boulv"
],
[
"branch",
"br",
"brnch"
],
[
"bridge",
"brdge",
"brg"
],
[
"brook",
"brk"
],
[
"brooks",
"brks"
],
[
"burg",
"bg"
],
[
"burgs",
"bgs"
],
[
"bypass",
"byp",
"bypa",
"bypas",
"byps"
],
[
"camp",
"cp",
"cmp"
],
[
"canyon",
"canyn",
"cyn",
"cnyn"
],
[
"cape",
"cpe"
],
[
"causeway",
"cswy",
"causwa"
],
[
"center",
"cen",
"ctr",
"cent",
"centr",
"centre",
"cnter",
"cntr"
],
[
"centers",
"ctrs"
],
[
"circle",
"cir",
"circ",
"circl",
"crcl",
"crcle"
],
[
"circles",
"cirs"
],
[
"cliff",
"clf"
],
[
"cliffs",
"clfs"
],
[
"club",
"clb"
],
[
"common",
"cmn"
],
[
"commons",
"cmns"
],
[
"corner",
"cor"
],
[
"corners",
"cors"
],
[
"course",
"crse"
],
[
"court",
"ct"
],
[
"courts",
"cts"
],
[
"cove",
"cv"
],
[
"coves",
"cvs"
],
[
"creek",
"crk"
],
[
"crescent",
"cres",
"crsent",
"crsnt"
],
[
"crest",
"crst"
],
[
"crossing",
"xing",
"crssng"
],
[
"crossroad",
"xrd"
],
[
"crossroads",
"xrds"
],
[
"curve",
"curv"
],
[
"dale",
"dl"
],
[
"dam",
"dm"
],
[
"divide",
"div",
"dv",
"dvd"
],
[
"drive",
"dr",
"driv",
"drv"
],
[
"drives",
"drs"
],
[
"estate",
"est"
],
[
"estates",
"ests"
],
[
"expressway",
"exp",
"expy",
"expr",
"express",
"expw"
],
[
"extension",
"ext",
"extn",
"extnsn"
],
[
"extensions",
"exts"
],
[
"fall"
],
[
"falls",
"fls"
],
[
"ferry",
"fry",
"frry"
],
[
"field",
"fld"
],
[
"fields",
"flds"
],
[
"flat",
"flt"
],
[
"flats",
"flts"
],
[
"ford",
"frd"
],
[
"fords",
"frds"
],
[
"forest",
"frst",
"forests"
],
[
"forge",
"forg",
"frg"
],
[
"forges",
"frgs"
],
[
"fork",
"frk"
],
[
"forks",
"frks"
],
[
"fort",
"ft",
"frt"
],
[
"freeway",
"fwy",
"freewy",
"frway",
"frwy"
],
[
"garden",
"gdn",
"gardn",
"grden",
"grdn"
],
[
"gardens",
"gdns",
"grdns"
],
[
"gateway",
"gtwy",
"gatewy",
"gatway",
"gtway"
],
[
"glen",
"gln"
],
[
"glens",
"glns"
],
[
"green",
"grn"
],
[
"greens",
"grns"
],
[
"grove",
"grov",
"grv"
],
[
"groves",
"grvs"
],
[
"harbor",
"harb",
"hbr",
"harbr",
"hrbor"
],
[
"harbors",
"hbrs"
],
[
"haven",
"hvn"
],
[
"heights",
"ht",
"hts"
],
[
"highway",
"hwy",
"highwy",
"hiway",
"hiwy",
"hway"
],
[
"hill",
"hl"
],
[
"hills",
"hls"
],
[
"hollow",
"hllw",
"holw",
"hollows",
"holws"
],
[
"inlet",
"inlt"
],
[
"island",
"is",
"islnd"
],
[
"islands",
"iss",
"islnds"
],
[
"isle",
"isles"
],
[
"junction",
"jct",
"jction",
"jctn",
"junctn",
"juncton"
],
[
"junctions",
"jctns",
"jcts"
],
[
"key",
"ky"
],
[
"keys",
"kys"
],
[
"knoll",
"knl",
"knol"
],
[
"knolls",
"knls"
],
[
"lake",
"lk"
],
[
"lakes",
"lks"
],
[
"land"
],
[
"landing",
"lndg",
"lndng"
],
[
"lane",
"ln"
],
[
"light",
"lgt"
],
[
"lights",
"lgts"
],
[
"loaf",
"lf"
],
[
"lock",
"lck"
],
[
"locks",
"lcks"
],
[
"lodge",
"ldg",
"ldge",
"lodg"
],
[
"loop",
"loops"
],
[
"mall"
],
[
"manor",
"mnr"
],
[
"manors",
"mnrs"
],
[
"meadow",
"mdw"
],
[
"meadows",
"mdw",
"mdws",
"medows"
],
[
"mews"
],
[
"mill",
"ml"
],
[
"mills",
"mls"
],
[
"mission",
"missn",
"msn",
"mssn"
],
[
"motorway",
"mtwy"
],
[
"mount",
"mnt",
"mt"
],
[
"mountain",
"mntain",
"mtn",
"mntn",
"mountin",
"mtin"
],
[
"mountains",
"mntns",
"mtns"
],
[
"neck",
"nck"
],
[
"orchard",
"orch",
"orchrd"
],
[
"oval",
"ovl"
],
[
"overpass",
"opas"
],
[
"park",
"prk"
],
[
"parks",
"park"
],
[
"parkway",
"pkwy",
"parkwy",
"pkway",
"pky"
],
[
"parkways",
"pkwy",
"pkwys"
],
[
"pass"
],
[
"passage",
"psge"
],
[
"path",
"paths"
],
[
"pike",
"pikes"
],
[
"pine",
"pne"
],
[
"pines",
"pnes"
],
[
"place",
"pl"
],
[
"plain",
"pln"
],
[
"plains",
"plns"
],
[
"plaza",
"plz",
"plza"
],
[
"point",
"pt"
],
[
"points",
"pts"
],
[
"port",
"prt"
],
[
"ports",
"prts"
],
[
"prairie",
"pr",
"prr"
],
[
"radial",
"rad",
"radl",
"radiel"
],
[
"ramp"
],
[
"ranch",
"rnch",
"ranches",
"rnchs"
],
[
"rapid",
"rpd"
],
[
"rapids",
"rpds"
],
[
"rest",
"rst"
],
[
"ridge",
"rdg",
"rdge"
],
[
"ridges",
"rdgs"
],
[
"river",
"riv",
"rvr",
"rivr"
],
[
"road",
"rd"
],
[
"roads",
"rds"
],
[
"route",
"rte"
],
[
"row"
],
[
"rue"
],
[
"run"
],
[
"shoal",
"shl"
],
[
"shoals",
"shls"
],
[
"shore",
"shoar",
"shr"
],
[
"shores",
"shoars",
"shrs"
],
[
"skyway",
"skwy"
],
[
"spring",
"spg",
"spng",
"sprng"
],
[
"springs",
"spgs",
"spngs",
"sprngs"
],
[
"spur"
],
[
"spurs",
"spur"
],
[
"square",
"sq",
"sqr",
"sqre",
"squ"
],
[
"squares",
"sqrs",
"sqs"
],
[
"station",
"sta",
"statn",
"stn"
],
[
"stravenue",
"stra",
"strav",
"straven",
"stravn",
"strvn",
"strvnue"
],
[
"stream",
"strm",
"streme"
],
[
"street",
"st",
"strt",
"str"
],
[
"streets",
"sts"
],
[
"summit",
"smt",
"sumit",
"sumitt"
],
[
"terrace",
"ter",
"terr"
],
[
"throughway",
"trwy"
],
[
"trace",
"trce",
"traces"
],
[
"track",
"trak",
"tracks",
"trk",
"trks"
],
[
"trafficway",
"trfy"
],
[
"trail",
"trl",
"trails",
"trls"
],
[
"trailer",
"trlr",
"trlrs"
],
[
"tunnel",
"tunel",
"tunl",
"tunls",
"tunnels",
"tunnl"
],
[
"turnpike",
"trnpk",
"tpke",
"turnpk"
],
[
"underpass",
"upas"
],
[
"union",
"un"
],
[
"unions",
"uns"
],
[
"valley",
"vly",
"vally",
"vlly"
],
[
"valleys",
"vlys"
],
[
"viaduct",
"vdct",
"via",
"viadct"
],
[
"view",
"vw"
],
[
"views",
"vws"
],
[
"village",
"vill",
"vlg",
"villag",
"villg",
"villiage"
],
[
"villages",
"vlgs"
],
[
"ville",
"vl"
],
[
"vista",
"vis",
"vist",
"vst",
"vsta"
],
[
"walk"
],
[
"walks",
"walk"
],
[
"wall"
],
[
"way",
"wy"
],
[
"ways"
],
[
"well",
"wl"
],
[
"wells",
"wls"
]
] | 21.762048 | 4,206 | 0.263299 | 1,240 | 21,675 | 4.58871 | 0.48871 | 0.02109 | 0.005624 | 0.005975 | 0.886116 | 0.886116 | 0.886116 | 0.886116 | 0.886116 | 0.886116 | 0 | 0.000314 | 0.559262 | 21,675 | 996 | 4,207 | 21.762048 | 0.59531 | 0 | 0 | 0.215087 | 0 | 0 | 0.249828 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.001019 | false | 0.006116 | 0.001019 | 0 | 0.003058 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
482b5b51d86f4d9258cbec2cc7552b05f86dab36 | 185,847 | py | Python | smartrecruiters_python_client/apis/configuration_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | 5 | 2018-03-27T08:20:13.000Z | 2022-03-30T06:23:38.000Z | smartrecruiters_python_client/apis/configuration_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | null | null | null | smartrecruiters_python_client/apis/configuration_api.py | roksela/smartrecruiters-python-client | 6d0849d173a3d6718b5f0769098f4c76857f637d | [
"MIT"
] | 2 | 2018-12-05T04:48:37.000Z | 2020-12-17T12:12:12.000Z | # coding: utf-8
"""
Unofficial python library for the SmartRecruiters API
The SmartRecruiters API provides a platform to integrate services or applications, build apps and create fully customizable career sites. It exposes SmartRecruiters functionality and allows to connect and build software enhancing it.
OpenAPI spec version: 1
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import sys
import os
import re
# python 2 and python 3 compatibility library
from six import iteritems
from ..configuration import Configuration
from ..api_client import ApiClient
class ConfigurationApi(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
Ref: https://github.com/swagger-api/swagger-codegen
"""
def __init__(self, api_client=None):
config = Configuration()
if api_client:
self.api_client = api_client
else:
if not config.api_client:
config.api_client = ApiClient()
self.api_client = config.api_client
def configuration_candidate_properties_all(self, **kwargs):
"""
Get a list of available candidate properties
Get all candidate properties and their configuration for a company
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_all(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: CandidatePropertyDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_all_with_http_info(**kwargs)
else:
(data) = self.configuration_candidate_properties_all_with_http_info(**kwargs)
return data
def configuration_candidate_properties_all_with_http_info(self, **kwargs):
"""
Get a list of available candidate properties
Get all candidate properties and their configuration for a company
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_all_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: CandidatePropertyDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/candidate-properties'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyDefinitionList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
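The synchronous-versus-callback dispatch that each public wrapper performs can be sketched with a stub in place of the HTTP call (`call_api_stub` and its result dict are hypothetical stand-ins, not part of the generated client):

```python
import threading

def call_api_stub(callback=None):
    """Run synchronously, or in a background thread when a callback is given."""
    def do_request():
        result = {'status': 'ok'}  # stands in for the deserialized HTTP response
        if callback:
            callback(result)
        return result
    if callback:
        # Asynchronous path: start the request and hand back the thread.
        thread = threading.Thread(target=do_request)
        thread.start()
        return thread
    # Synchronous path: block and return the response directly.
    return do_request()

print(call_api_stub()['status'])  # ok
```

A caller using the asynchronous path collects the response through the callback and joins the returned thread, which is what the `>>> thread = api...(callback=...)` docstring examples above rely on.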
def configuration_candidate_properties_get(self, id, **kwargs):
"""
Get candidate property by id
Get candidate property details and its configuration by id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_get(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:return: CandidatePropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_get_with_http_info(id, **kwargs)
else:
(data) = self.configuration_candidate_properties_get_with_http_info(id, **kwargs)
return data
def configuration_candidate_properties_get_with_http_info(self, id, **kwargs):
"""
Get candidate property by id
Get candidate property details and its configuration by id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_get_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:return: CandidatePropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_candidate_properties_get`")
collection_formats = {}
resource_path = '/configuration/candidate-properties/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_candidate_properties_values_all(self, id, **kwargs):
"""
Get Candidate Property values
Lists all available values for given candidate property id. This endpoint is available only for SINGLE_SELECT candidate property type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_all(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:return: CandidatePropertyValueList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_values_all_with_http_info(id, **kwargs)
else:
(data) = self.configuration_candidate_properties_values_all_with_http_info(id, **kwargs)
return data
def configuration_candidate_properties_values_all_with_http_info(self, id, **kwargs):
"""
Get Candidate Property values
Lists all available values for given candidate property id. This endpoint is available only for SINGLE_SELECT candidate property type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_all_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:return: CandidatePropertyValueList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_values_all" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_candidate_properties_values_all`")
collection_formats = {}
resource_path = '/configuration/candidate-properties/{id}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyValueList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
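The `resource_path` above is a URL template whose `{id}` placeholder is filled in from `path_params` by the API client. A minimal sketch of that substitution (a hypothetical helper, not the real `api_client` implementation):

```python
from urllib.parse import quote

def build_resource_path(template, path_params):
    """Substitute each '{name}' placeholder with its URL-encoded value."""
    path = template
    for name, value in path_params.items():
        path = path.replace('{%s}' % name, quote(str(value), safe=''))
    return path

# The template used by the values endpoint above:
print(build_resource_path('/configuration/candidate-properties/{id}/values',
                          {'id': 'prop-42'}))
# -> /configuration/candidate-properties/prop-42/values
```

Values are percent-encoded with an empty `safe` set so that characters like `/` in an id cannot alter the request path.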
def configuration_candidate_properties_values_create(self, id, candidate_property_value, **kwargs):
"""
Create candidate property value
Create SINGLE_SELECT candidate property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_create(id, candidate_property_value, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param CandidatePropertyValue candidate_property_value: Candidate property value. (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_values_create_with_http_info(id, candidate_property_value, **kwargs)
else:
data = self.configuration_candidate_properties_values_create_with_http_info(id, candidate_property_value, **kwargs)
return data
def configuration_candidate_properties_values_create_with_http_info(self, id, candidate_property_value, **kwargs):
"""
Create candidate property value
Create SINGLE_SELECT candidate property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_create_with_http_info(id, candidate_property_value, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param CandidatePropertyValue candidate_property_value: Candidate property value. (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'candidate_property_value']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_values_create" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_candidate_properties_values_create`")
# verify the required parameter 'candidate_property_value' is set
if ('candidate_property_value' not in params) or (params['candidate_property_value'] is None):
raise ValueError("Missing the required parameter `candidate_property_value` when calling `configuration_candidate_properties_values_create`")
collection_formats = {}
resource_path = '/configuration/candidate-properties/{id}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'candidate_property_value' in params:
body_params = params['candidate_property_value']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyValue',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
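Each `*_with_http_info` method rejects unknown keyword arguments against its `all_params` whitelist before doing anything else. The guard in isolation, as a simplified standalone sketch:

```python
def validate_kwargs(method_name, all_params, kwargs):
    """Mirror of the generated guard: unknown keywords raise TypeError."""
    for key in kwargs:
        if key not in all_params:
            raise TypeError(
                "Got an unexpected keyword argument '%s'"
                " to method %s" % (key, method_name)
            )

all_params = ['id', 'candidate_property_value',
              'callback', '_return_http_data_only',
              '_preload_content', '_request_timeout']
# A recognized transport option passes silently:
validate_kwargs('configuration_candidate_properties_values_create',
                all_params, {'_request_timeout': 30})
```

A typo such as `candidate_propertyvalue=...` therefore fails fast with a `TypeError` naming the offending keyword, rather than being silently dropped.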
def configuration_candidate_properties_values_get(self, id, value_id, **kwargs):
"""
Get candidate property value by id
Get a candidate property value by its id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_get(id, value_id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param str value_id: Identifier of candidate property value (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_values_get_with_http_info(id, value_id, **kwargs)
else:
data = self.configuration_candidate_properties_values_get_with_http_info(id, value_id, **kwargs)
return data
def configuration_candidate_properties_values_get_with_http_info(self, id, value_id, **kwargs):
"""
Get candidate property value by id
Get a candidate property value by its id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_get_with_http_info(id, value_id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param str value_id: Identifier of candidate property value (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_values_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_candidate_properties_values_get`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_candidate_properties_values_get`")
collection_formats = {}
resource_path = '/configuration/candidate-properties/{id}/values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyValue',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
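The callback convention these docstrings describe is: with no `callback` the call blocks and returns the response; with a `callback` it returns a worker thread and the callback later receives the response. A hedged standalone sketch of that dispatch (not the real `api_client.call_api`, which also handles serialization and HTTP):

```python
import threading

def call_api(do_request, callback=None):
    """Synchronous by default; with a callback, run in a thread instead."""
    if callback is None:
        # Block and hand the response straight back to the caller.
        return do_request()

    def worker():
        # The callback receives the response once the request completes.
        callback(do_request())

    thread = threading.Thread(target=worker)
    thread.start()
    return thread  # caller may join() or ignore it

results = []
t = call_api(lambda: {'values': []}, callback=results.append)
t.join()
```

This is why the docstrings say the method "returns the request thread" when called asynchronously: the return value is the `Thread`, not the data.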
def configuration_candidate_properties_values_update(self, id, value_id, candidate_property_value_label, **kwargs):
"""
Update candidate property value label
Update candidate property value label
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_update(id, value_id, candidate_property_value_label, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param str value_id: Identifier of candidate property value (required)
:param CandidatePropertyValueLabel candidate_property_value_label: Candidate property value label. (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_candidate_properties_values_update_with_http_info(id, value_id, candidate_property_value_label, **kwargs)
else:
data = self.configuration_candidate_properties_values_update_with_http_info(id, value_id, candidate_property_value_label, **kwargs)
return data
def configuration_candidate_properties_values_update_with_http_info(self, id, value_id, candidate_property_value_label, **kwargs):
"""
Update candidate property value label
Update candidate property value label
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_candidate_properties_values_update_with_http_info(id, value_id, candidate_property_value_label, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of candidate property (required)
:param str value_id: Identifier of candidate property value (required)
:param CandidatePropertyValueLabel candidate_property_value_label: Candidate property value label. (required)
:return: CandidatePropertyValue
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id', 'candidate_property_value_label']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_candidate_properties_values_update" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_candidate_properties_values_update`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_candidate_properties_values_update`")
# verify the required parameter 'candidate_property_value_label' is set
if ('candidate_property_value_label' not in params) or (params['candidate_property_value_label'] is None):
raise ValueError("Missing the required parameter `candidate_property_value_label` when calling `configuration_candidate_properties_values_update`")
collection_formats = {}
resource_path = '/configuration/candidate-properties/{id}/values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'candidate_property_value_label' in params:
body_params = params['candidate_property_value_label']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CandidatePropertyValue',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_company_my(self, **kwargs):
"""
Get company information
Get all information about your company.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_company_my(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: CompanyConfiguration
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_company_my_with_http_info(**kwargs)
else:
data = self.configuration_company_my_with_http_info(**kwargs)
return data
def configuration_company_my_with_http_info(self, **kwargs):
"""
Get company information
Get all information about your company.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_company_my_with_http_info(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: CompanyConfiguration
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_company_my" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/company'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='CompanyConfiguration',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_department_all(self, **kwargs):
"""
Get departments
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_all(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: Departments
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_department_all_with_http_info(**kwargs)
else:
data = self.configuration_department_all_with_http_info(**kwargs)
return data
def configuration_department_all_with_http_info(self, **kwargs):
"""
Get departments
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_all_with_http_info(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: Departments
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_department_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/departments'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Departments',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_department_create(self, department, **kwargs):
"""
Create a department
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_create(department, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param Department department: department to be created (required)
:return: Department
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_department_create_with_http_info(department, **kwargs)
else:
data = self.configuration_department_create_with_http_info(department, **kwargs)
return data
def configuration_department_create_with_http_info(self, department, **kwargs):
"""
Create a department
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_create_with_http_info(department, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param Department department: department to be created (required)
:return: Department
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['department']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_department_create" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'department' is set
if ('department' not in params) or (params['department'] is None):
raise ValueError("Missing the required parameter `department` when calling `configuration_department_create`")
collection_formats = {}
resource_path = '/configuration/departments'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'department' in params:
body_params = params['department']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Department',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_department_get(self, id, **kwargs):
"""
Get department
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_get(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a department (required)
:return: Department
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_department_get_with_http_info(id, **kwargs)
else:
data = self.configuration_department_get_with_http_info(id, **kwargs)
return data
def configuration_department_get_with_http_info(self, id, **kwargs):
"""
Get department
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_department_get_with_http_info(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a department (required)
:return: Department
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_department_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_department_get`")
collection_formats = {}
resource_path = '/configuration/departments/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Department',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
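All of these endpoints build their `Accept` header through `select_header_accept`. A simplified sketch of that selection, under the assumption that it just joins the acceptable media types into one header value (the real client may additionally prefer JSON):

```python
def select_header_accept(accepts):
    """Join acceptable media types into a single Accept header value.

    An empty list means no Accept header should be sent.
    """
    if not accepts:
        return None
    return ', '.join(a.lower() for a in accepts)

print(select_header_accept(['application/json; charset=utf-8']))
# -> application/json; charset=utf-8
```

Since every endpoint here declares only `application/json; charset=utf-8`, the resulting header is always that single media type.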
def configuration_hiring_process_all(self, **kwargs):
"""
Get a list of hiring processes
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_hiring_process_all(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: HiringProcesses
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_hiring_process_all_with_http_info(**kwargs)
else:
data = self.configuration_hiring_process_all_with_http_info(**kwargs)
return data
def configuration_hiring_process_all_with_http_info(self, **kwargs):
"""
Get a list of hiring processes
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_hiring_process_all_with_http_info(callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:return: HiringProcesses
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_hiring_process_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/hiring-processes'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='HiringProcesses',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_hiring_process_get(self, id, **kwargs):
"""
Get hiring process
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_hiring_process_get(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a hiring process (required)
:return: HiringProcess
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_hiring_process_get_with_http_info(id, **kwargs)
else:
data = self.configuration_hiring_process_get_with_http_info(id, **kwargs)
return data
def configuration_hiring_process_get_with_http_info(self, id, **kwargs):
"""
Get hiring process
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_hiring_process_get_with_http_info(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a hiring process (required)
:return: HiringProcess
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_hiring_process_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_hiring_process_get`")
collection_formats = {}
resource_path = '/configuration/hiring-processes/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='HiringProcess',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_activate(self, id, **kwargs):
"""
Activate a job property
Activates the job property with the given id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_job_properties_activate(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_activate_with_http_info(id, **kwargs)
else:
data = self.configuration_job_properties_activate_with_http_info(id, **kwargs)
return data
def configuration_job_properties_activate_with_http_info(self, id, **kwargs):
"""
Activate a job property
Activates the job property with the given id.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
...     pprint(response)
>>>
>>> thread = api.configuration_job_properties_activate_with_http_info(id, callback=callback_function)
:param function callback: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_activate" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_activate`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/activation'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_all(self, **kwargs):
"""
Get a list of available job properties
Get a list of available job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_all(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: JobPropertyDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_all_with_http_info(**kwargs)
else:
(data) = self.configuration_job_properties_all_with_http_info(**kwargs)
return data
def configuration_job_properties_all_with_http_info(self, **kwargs):
"""
Get a list of available job properties
Get a list of available job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_all_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: JobPropertyDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/job-properties'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyDefinitionList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_create(self, **kwargs):
"""
Create a job property
Creates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_create(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param JobPropertyDefinition job_property_definition: job property to be created
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_create_with_http_info(**kwargs)
else:
(data) = self.configuration_job_properties_create_with_http_info(**kwargs)
return data
def configuration_job_properties_create_with_http_info(self, **kwargs):
"""
Create a job property
Creates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_create_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param JobPropertyDefinition job_property_definition: job property to be created
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['job_property_definition']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_create" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/job-properties'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'job_property_definition' in params:
body_params = params['job_property_definition']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_deactivate(self, id, **kwargs):
"""
Deactivate a job property
Deactivates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_deactivate(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_deactivate_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_deactivate_with_http_info(id, **kwargs)
return data
def configuration_job_properties_deactivate_with_http_info(self, id, **kwargs):
"""
Deactivate a job property
Deactivates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_deactivate_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_deactivate" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_deactivate`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/activation'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_all(self, id, **kwargs):
"""
Get job property's dependents
Get a list of a job property's dependents.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_all(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: DependentJobProperties
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_all_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_all_with_http_info(id, **kwargs)
return data
def configuration_job_properties_dependents_all_with_http_info(self, id, **kwargs):
"""
Get job property's dependents
Get a list of a job property's dependents.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_all_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: DependentJobProperties
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_all" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_all`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/dependents'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DependentJobProperties',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_create(self, id, dependent_job_properties_ids, **kwargs):
"""
Create job property dependents
Creates dependencies between job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_create(id, dependent_job_properties_ids, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param DependentJobPropertiesIds dependent_job_properties_ids: Identifiers of the dependent job properties (required)
:return: DependentJobProperties
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_create_with_http_info(id, dependent_job_properties_ids, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_create_with_http_info(id, dependent_job_properties_ids, **kwargs)
return data
def configuration_job_properties_dependents_create_with_http_info(self, id, dependent_job_properties_ids, **kwargs):
"""
Create job property dependents
Creates dependencies between job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_create_with_http_info(id, dependent_job_properties_ids, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param DependentJobPropertiesIds dependent_job_properties_ids: Identifiers of the dependent job properties (required)
:return: DependentJobProperties
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'dependent_job_properties_ids']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_create" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_create`")
# verify the required parameter 'dependent_job_properties_ids' is set
if ('dependent_job_properties_ids' not in params) or (params['dependent_job_properties_ids'] is None):
raise ValueError("Missing the required parameter `dependent_job_properties_ids` when calling `configuration_job_properties_dependents_create`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/dependents'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'dependent_job_properties_ids' in params:
body_params = params['dependent_job_properties_ids']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DependentJobProperties',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_remove(self, id, dependent_id, **kwargs):
"""
Remove job property's dependent
Removes a dependency between job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_remove(id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str dependent_id: Identifier of a job property's dependent (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_remove_with_http_info(id, dependent_id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_remove_with_http_info(id, dependent_id, **kwargs)
return data
def configuration_job_properties_dependents_remove_with_http_info(self, id, dependent_id, **kwargs):
"""
Remove job property's dependent
Removes a dependency between job properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_remove_with_http_info(id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str dependent_id: Identifier of a job property's dependent (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'dependent_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_remove" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_remove`")
# verify the required parameter 'dependent_id' is set
if ('dependent_id' not in params) or (params['dependent_id'] is None):
raise ValueError("Missing the required parameter `dependent_id` when calling `configuration_job_properties_dependents_remove`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/dependents/{dependentId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'dependent_id' in params:
path_params['dependentId'] = params['dependent_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_values_add(self, id, value_id, dependent_id, dependent_job_property_value_id, **kwargs):
"""
Add job property's dependent value
Adds a job property's dependent value for a specific job property's value.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_add(id, value_id, dependent_id, dependent_job_property_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:param Identifiable dependent_job_property_value_id: Identifier of job property's dependent value (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_values_add_with_http_info(id, value_id, dependent_id, dependent_job_property_value_id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_values_add_with_http_info(id, value_id, dependent_id, dependent_job_property_value_id, **kwargs)
return data
def configuration_job_properties_dependents_values_add_with_http_info(self, id, value_id, dependent_id, dependent_job_property_value_id, **kwargs):
"""
Add job property's dependent value
Adds a job property's dependent value for a specific job property's value.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_add_with_http_info(id, value_id, dependent_id, dependent_job_property_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:param Identifiable dependent_job_property_value_id: Identifier of job property's dependent value (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id', 'dependent_id', 'dependent_job_property_value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_values_add" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_values_add`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_dependents_values_add`")
# verify the required parameter 'dependent_id' is set
if ('dependent_id' not in params) or (params['dependent_id'] is None):
raise ValueError("Missing the required parameter `dependent_id` when calling `configuration_job_properties_dependents_values_add`")
# verify the required parameter 'dependent_job_property_value_id' is set
if ('dependent_job_property_value_id' not in params) or (params['dependent_job_property_value_id'] is None):
raise ValueError("Missing the required parameter `dependent_job_property_value_id` when calling `configuration_job_properties_dependents_values_add`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}/dependents/{dependentId}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
if 'dependent_id' in params:
path_params['dependentId'] = params['dependent_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'dependent_job_property_value_id' in params:
body_params = params['dependent_job_property_value_id']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_values_all(self, id, dependent_id, **kwargs):
"""
Get dependent job property's values
Get a dependent job property's values with their correlation to the parent field.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_all(id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str dependent_id: Identifier of dependent job property (required)
:return: DependentJobPropertyValuesRelations
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_values_all_with_http_info(id, dependent_id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_values_all_with_http_info(id, dependent_id, **kwargs)
return data
def configuration_job_properties_dependents_values_all_with_http_info(self, id, dependent_id, **kwargs):
"""
Get dependent job property's values
Get a dependent job property's values with their correlation to the parent field.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_all_with_http_info(id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str dependent_id: Identifier of dependent job property (required)
:return: DependentJobPropertyValuesRelations
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'dependent_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_values_all" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_values_all`")
# verify the required parameter 'dependent_id' is set
if ('dependent_id' not in params) or (params['dependent_id'] is None):
raise ValueError("Missing the required parameter `dependent_id` when calling `configuration_job_properties_dependents_values_all`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/dependents/{dependentId}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'dependent_id' in params:
path_params['dependentId'] = params['dependent_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DependentJobPropertyValuesRelations',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_values_get(self, id, value_id, dependent_id, **kwargs):
"""
Get job property's dependent values
Get a list of a job property's dependent values for a specific job property's value.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_get(id, value_id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:return: DependentJobPropertyValues
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_values_get_with_http_info(id, value_id, dependent_id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_values_get_with_http_info(id, value_id, dependent_id, **kwargs)
return data
def configuration_job_properties_dependents_values_get_with_http_info(self, id, value_id, dependent_id, **kwargs):
"""
Get job property's dependent values
Get the list of a job property's dependent values for a specific job property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_get_with_http_info(id, value_id, dependent_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:return: DependentJobPropertyValues
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id', 'dependent_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_values_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_values_get`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_dependents_values_get`")
# verify the required parameter 'dependent_id' is set
if ('dependent_id' not in params) or (params['dependent_id'] is None):
raise ValueError("Missing the required parameter `dependent_id` when calling `configuration_job_properties_dependents_values_get`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}/dependents/{dependentId}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
if 'dependent_id' in params:
path_params['dependentId'] = params['dependent_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DependentJobPropertyValues',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_dependents_values_remove(self, id, value_id, dependent_id, dependent_value_id, **kwargs):
"""
Remove job property's dependent values relationship
Remove the relationship between dependent job property values
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_remove(id, value_id, dependent_id, dependent_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:param str dependent_value_id: Identifier of job property's dependent value (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_dependents_values_remove_with_http_info(id, value_id, dependent_id, dependent_value_id, **kwargs)
else:
(data) = self.configuration_job_properties_dependents_values_remove_with_http_info(id, value_id, dependent_id, dependent_value_id, **kwargs)
return data
def configuration_job_properties_dependents_values_remove_with_http_info(self, id, value_id, dependent_id, dependent_value_id, **kwargs):
"""
Remove job property's dependent values relationship
Remove the relationship between dependent job property values
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_dependents_values_remove_with_http_info(id, value_id, dependent_id, dependent_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value (required)
:param str dependent_id: Identifier of job property's dependent (required)
:param str dependent_value_id: Identifier of job property's dependent value (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id', 'dependent_id', 'dependent_value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_dependents_values_remove" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_dependents_values_remove`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_dependents_values_remove`")
# verify the required parameter 'dependent_id' is set
if ('dependent_id' not in params) or (params['dependent_id'] is None):
raise ValueError("Missing the required parameter `dependent_id` when calling `configuration_job_properties_dependents_values_remove`")
# verify the required parameter 'dependent_value_id' is set
if ('dependent_value_id' not in params) or (params['dependent_value_id'] is None):
raise ValueError("Missing the required parameter `dependent_value_id` when calling `configuration_job_properties_dependents_values_remove`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}/dependents/{dependentId}/values/{dependentValueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
if 'dependent_id' in params:
path_params['dependentId'] = params['dependent_id']
if 'dependent_value_id' in params:
path_params['dependentValueId'] = params['dependent_value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_get(self, id, **kwargs):
"""
Get job property by id
Get job property by id
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_get(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_get_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_get_with_http_info(id, **kwargs)
return data
def configuration_job_properties_get_with_http_info(self, id, **kwargs):
"""
Get job property by id
Get job property by id
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_get_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_get`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_update(self, id, **kwargs):
"""
Update a job property
Updates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_update(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param JSONPatch json_patch: patch request
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_update_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_update_with_http_info(id, **kwargs)
return data
def configuration_job_properties_update_with_http_info(self, id, **kwargs):
"""
Update a job property
Updates a job property.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_update_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param JSONPatch json_patch: patch request
:return: JobPropertyDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'json_patch']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_update" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_update`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'json_patch' in params:
body_params = params['json_patch']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json-patch+json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_archive(self, id, value_id, **kwargs):
"""
Archive a job property value
Archive a job property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_archive(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be archived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_archive_with_http_info(id, value_id, **kwargs)
else:
(data) = self.configuration_job_properties_values_archive_with_http_info(id, value_id, **kwargs)
return data
def configuration_job_properties_values_archive_with_http_info(self, id, value_id, **kwargs):
"""
Archive a job property value
Archive a job property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_archive_with_http_info(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be archived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_archive" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_archive`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_values_archive`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/archive-values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_create(self, id, **kwargs):
"""
Create a job property value
Creates a job property value.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_create(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param JobPropertyValueDefinition job_property_value_definition: job property value object to be created
:return: JobPropertyValueDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_create_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_values_create_with_http_info(id, **kwargs)
return data
def configuration_job_properties_values_create_with_http_info(self, id, **kwargs):
"""
Create a job property value
Creates a job property value.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_create_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param JobPropertyValueDefinition job_property_value_definition: job property value object to be created
:return: JobPropertyValueDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'job_property_value_definition']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_create" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_create`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'job_property_value_definition' in params:
body_params = params['job_property_value_definition']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'POST',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyValueDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_deprecated_archive(self, id, value_id, **kwargs):
"""
Archive a job property value
Archive a job property value. Please use `PUT /configuration/job-properties/{id}/archive-values/{valueId}` instead.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_deprecated_archive(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be archived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_deprecated_archive_with_http_info(id, value_id, **kwargs)
else:
(data) = self.configuration_job_properties_values_deprecated_archive_with_http_info(id, value_id, **kwargs)
return data
def configuration_job_properties_values_deprecated_archive_with_http_info(self, id, value_id, **kwargs):
"""
Archive a job property value
Archive a job property value. Please use `PUT /configuration/job-properties/{id}/archive-values/{valueId}` instead.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_deprecated_archive_with_http_info(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be archived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_deprecated_archive" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_deprecated_archive`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_values_deprecated_archive`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_deprecated_unarchive(self, id, value_id, **kwargs):
"""
Unarchive a job property value
Unarchive a job property value. Please use `DELETE /configuration/job-properties/{id}/archive-values/{valueId}` instead.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_deprecated_unarchive(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be unarchived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_deprecated_unarchive_with_http_info(id, value_id, **kwargs)
else:
(data) = self.configuration_job_properties_values_deprecated_unarchive_with_http_info(id, value_id, **kwargs)
return data
def configuration_job_properties_values_deprecated_unarchive_with_http_info(self, id, value_id, **kwargs):
"""
Unarchive a job property value
Unarchive a job property value. Please use `DELETE /configuration/job-properties/{id}/archive-values/{valueId}` instead.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_deprecated_unarchive_with_http_info(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be unarchived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_deprecated_unarchive" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_deprecated_unarchive`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_values_deprecated_unarchive`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PUT',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_get(self, id, **kwargs):
"""
Get available job property values
Get available job property values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_get(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: JobPropertyValueDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_get_with_http_info(id, **kwargs)
else:
(data) = self.configuration_job_properties_values_get_with_http_info(id, **kwargs)
return data
def configuration_job_properties_values_get_with_http_info(self, id, **kwargs):
"""
Get available job property values
Get available job property values.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_get_with_http_info(id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:return: JobPropertyValueDefinitionList
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_get" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_get`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyValueDefinitionList',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_unarchive(self, id, value_id, **kwargs):
"""
Unarchive a job property value
Unarchive a job property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_unarchive(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be unarchived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_unarchive_with_http_info(id, value_id, **kwargs)
else:
(data) = self.configuration_job_properties_values_unarchive_with_http_info(id, value_id, **kwargs)
return data
def configuration_job_properties_values_unarchive_with_http_info(self, id, value_id, **kwargs):
"""
Unarchive a job property value
Unarchive a job property value
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_unarchive_with_http_info(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be unarchived (required)
:return: None
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_unarchive" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_unarchive`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_values_unarchive`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/archive-values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'DELETE',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type=None,
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_job_properties_values_update(self, id, value_id, **kwargs):
"""
Update a job property value
Update a job property value. Returns an updated job property value object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_update(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be updated (required)
:param JSONPatch json_patch: patch request
:return: JobPropertyValueDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_job_properties_values_update_with_http_info(id, value_id, **kwargs)
else:
(data) = self.configuration_job_properties_values_update_with_http_info(id, value_id, **kwargs)
return data
def configuration_job_properties_values_update_with_http_info(self, id, value_id, **kwargs):
"""
Update a job property value
Update a job property value. Returns an updated job property value object.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_job_properties_values_update_with_http_info(id, value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str id: Identifier of a job property (required)
:param str value_id: Identifier of a job property value to be updated (required)
:param JSONPatch json_patch: patch request
:return: JobPropertyValueDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['id', 'value_id', 'json_patch']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_job_properties_values_update" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'id' is set
if ('id' not in params) or (params['id'] is None):
raise ValueError("Missing the required parameter `id` when calling `configuration_job_properties_values_update`")
# verify the required parameter 'value_id' is set
if ('value_id' not in params) or (params['value_id'] is None):
raise ValueError("Missing the required parameter `value_id` when calling `configuration_job_properties_values_update`")
collection_formats = {}
resource_path = '/configuration/job-properties/{id}/values/{valueId}'.replace('{format}', 'json')
path_params = {}
if 'id' in params:
path_params['id'] = params['id']
if 'value_id' in params:
path_params['valueId'] = params['value_id']
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'json_patch' in params:
body_params = params['json_patch']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.\
select_header_content_type(['application/json-patch+json'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='JobPropertyValueDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_offer_properties_all(self, **kwargs):
"""
Get a list of available offer properties
Get a list of available offer properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_offer_properties_all(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: OfferPropertiesDefinition
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_offer_properties_all_with_http_info(**kwargs)
else:
(data) = self.configuration_offer_properties_all_with_http_info(**kwargs)
return data
def configuration_offer_properties_all_with_http_info(self, **kwargs):
"""
Get a list of available offer properties
Get a list of available offer properties.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_offer_properties_all_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: OfferPropertiesDefinition
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_offer_properties_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/offer-properties'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='OfferPropertiesDefinition',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_reasons_rejection_all(self, **kwargs):
"""
Get rejection reasons
Get rejection reasons
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_reasons_rejection_all(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: Properties
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_reasons_rejection_all_with_http_info(**kwargs)
else:
(data) = self.configuration_reasons_rejection_all_with_http_info(**kwargs)
return data
def configuration_reasons_rejection_all_with_http_info(self, **kwargs):
"""
Get rejection reasons
Get rejection reasons
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_reasons_rejection_all_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: Properties
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_reasons_rejection_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/rejection-reasons'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Properties',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_reasons_withdrawal_all(self, **kwargs):
"""
Get withdrawal reasons
Get withdrawal reasons
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_reasons_withdrawal_all(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: Properties
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_reasons_withdrawal_all_with_http_info(**kwargs)
else:
(data) = self.configuration_reasons_withdrawal_all_with_http_info(**kwargs)
return data
def configuration_reasons_withdrawal_all_with_http_info(self, **kwargs):
"""
Get withdrawal reasons
Get withdrawal reasons
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_reasons_withdrawal_all_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: Properties
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_reasons_withdrawal_all" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/withdrawal-reasons'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Properties',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_source_types(self, **kwargs):
"""
List candidate source types with subtypes
Get a list of all available candidate source types with subtypes
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_types(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: SourceTypes
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_source_types_with_http_info(**kwargs)
else:
(data) = self.configuration_source_types_with_http_info(**kwargs)
return data
def configuration_source_types_with_http_info(self, **kwargs):
"""
List candidate source types with subtypes
Get a list of all available candidate source types with subtypes
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_types_with_http_info(callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:return: SourceTypes
If the method is called asynchronously,
returns the request thread.
"""
all_params = []
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_source_types" % key
)
params[key] = val
del params['kwargs']
collection_formats = {}
resource_path = '/configuration/sources'.replace('{format}', 'json')
path_params = {}
query_params = {}
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='SourceTypes',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_source_values_all(self, source_type, **kwargs):
"""
List candidate sources
Get a list of all available candidate sources by type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_values_all(source_type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str source_type: Source type from /configuration/sources (required)
:param str source_sub_type: Source SubType
:param int limit: number of elements to return; the maximum value is 100
:param int offset: number of elements to skip in the result
:return: Sources
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_source_values_all_with_http_info(source_type, **kwargs)
else:
(data) = self.configuration_source_values_all_with_http_info(source_type, **kwargs)
return data
def configuration_source_values_all_with_http_info(self, source_type, **kwargs):
"""
List candidate sources
Get a list of all available candidate sources by type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_values_all_with_http_info(source_type, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str source_type: Source type from /configuration/sources (required)
:param str source_sub_type: Source SubType
:param int limit: number of elements to return; the maximum value is 100
:param int offset: number of elements to skip in the result
:return: Sources
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['source_type', 'source_sub_type', 'limit', 'offset']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_source_values_all" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'source_type' is set
if ('source_type' not in params) or (params['source_type'] is None):
raise ValueError("Missing the required parameter `source_type` when calling `configuration_source_values_all`")
if 'limit' in params and params['limit'] > 100:
raise ValueError("Invalid value for parameter `limit` when calling `configuration_source_values_all`, must be a value less than or equal to `100`")
if 'limit' in params and params['limit'] < 1:
raise ValueError("Invalid value for parameter `limit` when calling `configuration_source_values_all`, must be a value greater than or equal to `1`")
if 'offset' in params and params['offset'] < 0:
raise ValueError("Invalid value for parameter `offset` when calling `configuration_source_values_all`, must be a value greater than or equal to `0`")
collection_formats = {}
resource_path = '/configuration/sources/{sourceType}/values'.replace('{format}', 'json')
path_params = {}
if 'source_type' in params:
path_params['sourceType'] = params['source_type']
query_params = {}
if 'source_sub_type' in params:
query_params['sourceSubType'] = params['source_sub_type']
if 'limit' in params:
query_params['limit'] = params['limit']
if 'offset' in params:
query_params['offset'] = params['offset']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Sources',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
def configuration_source_values_single(self, source_type, source_value_id, **kwargs):
"""
Get a candidate source
Get a single candidate source for a given type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_values_single(source_type, source_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str source_type: Source type from /configuration/sources (required)
:param str source_value_id: Source id (required)
:param str source_sub_type: Source SubType from /configuration/sources
:return: Source
If the method is called asynchronously,
returns the request thread.
"""
kwargs['_return_http_data_only'] = True
if kwargs.get('callback'):
return self.configuration_source_values_single_with_http_info(source_type, source_value_id, **kwargs)
else:
(data) = self.configuration_source_values_single_with_http_info(source_type, source_value_id, **kwargs)
return data
def configuration_source_values_single_with_http_info(self, source_type, source_value_id, **kwargs):
"""
Get a candidate source
Get a single candidate source for a given type.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please define a `callback` function
to be invoked when receiving the response.
>>> def callback_function(response):
>>> pprint(response)
>>>
>>> thread = api.configuration_source_values_single_with_http_info(source_type, source_value_id, callback=callback_function)
:param callback function: The callback function
for asynchronous request. (optional)
:param str source_type: Source type from /configuration/sources (required)
:param str source_value_id: Source id (required)
:param str source_sub_type: Source SubType from /configuration/sources
:return: Source
If the method is called asynchronously,
returns the request thread.
"""
all_params = ['source_type', 'source_value_id', 'source_sub_type']
all_params.append('callback')
all_params.append('_return_http_data_only')
all_params.append('_preload_content')
all_params.append('_request_timeout')
params = locals()
for key, val in iteritems(params['kwargs']):
if key not in all_params:
raise TypeError(
"Got an unexpected keyword argument '%s'"
" to method configuration_source_values_single" % key
)
params[key] = val
del params['kwargs']
# verify the required parameter 'source_type' is set
if ('source_type' not in params) or (params['source_type'] is None):
raise ValueError("Missing the required parameter `source_type` when calling `configuration_source_values_single`")
# verify the required parameter 'source_value_id' is set
if ('source_value_id' not in params) or (params['source_value_id'] is None):
raise ValueError("Missing the required parameter `source_value_id` when calling `configuration_source_values_single`")
collection_formats = {}
resource_path = '/configuration/sources/{sourceType}/values/{sourceValueId}'.replace('{format}', 'json')
path_params = {}
if 'source_type' in params:
path_params['sourceType'] = params['source_type']
if 'source_value_id' in params:
path_params['sourceValueId'] = params['source_value_id']
query_params = {}
if 'source_sub_type' in params:
query_params['sourceSubType'] = params['source_sub_type']
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.\
select_header_accept(['application/json; charset=utf-8'])
# Authentication setting
auth_settings = ['key']
return self.api_client.call_api(resource_path, 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='Source',
auth_settings=auth_settings,
callback=params.get('callback'),
_return_http_data_only=params.get('_return_http_data_only'),
_preload_content=params.get('_preload_content', True),
_request_timeout=params.get('_request_timeout'),
collection_formats=collection_formats)
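Every wrapper above follows the same sync/async dispatch convention described in its docstring: without a `callback` keyword the deserialized response is returned directly; with one, the request runs on a worker thread and the thread object is returned. The following is a minimal, self-contained sketch of that dispatch pattern, an illustration only, not the real `api_client.call_api` implementation:

```python
import threading

# Simplified sketch of the callback dispatch used by the generated wrappers
# above (an illustration; the real client routes through api_client.call_api).
def call_api_like(work, callback=None):
    if callback:
        # Asynchronous path: run the work on a thread, hand the result to the
        # callback, and return the thread so the caller can join() it.
        thread = threading.Thread(target=lambda: callback(work()))
        thread.start()
        return thread
    # Synchronous path: return the result directly.
    return work()

results = []
thread = call_api_like(lambda: {'id': 'prop-1'}, callback=results.append)
thread.join()
print(call_api_like(lambda: 'sync'))  # -> sync
print(results)                        # -> [{'id': 'prop-1'}]
```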
# File: holobot/extensions/moderation/managers/ipermission_manager.py
# Repo: rexor12/holobot (MIT license)
from ..enums import ModeratorPermission
class IPermissionManager:
    async def add_permissions(self, server_id: str, user_id: str, permissions: ModeratorPermission) -> None:
        raise NotImplementedError

    async def remove_permissions(self, server_id: str, user_id: str, permissions: ModeratorPermission) -> None:
        raise NotImplementedError

    async def has_permissions(self, server_id: str, user_id: str, permissions: ModeratorPermission) -> bool:
        raise NotImplementedError
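A hedged sketch of one possible in-memory implementation of this interface follows. The real `ModeratorPermission` enum is imported from `..enums`; the `Flag` stand-in and its member names below are assumptions made so the sketch is self-contained:

```python
import asyncio
from enum import Flag, auto

# Stand-in for the real ModeratorPermission enum (member names are
# hypothetical; the actual values live in ..enums).
class ModeratorPermission(Flag):
    NONE = 0
    WARN_USERS = auto()
    MUTE_USERS = auto()

class InMemoryPermissionManager:
    """Illustrative implementation of the IPermissionManager interface."""

    def __init__(self):
        # (server_id, user_id) -> accumulated ModeratorPermission flags
        self._permissions = {}

    async def add_permissions(self, server_id, user_id, permissions):
        key = (server_id, user_id)
        self._permissions[key] = self._permissions.get(key, ModeratorPermission.NONE) | permissions

    async def remove_permissions(self, server_id, user_id, permissions):
        key = (server_id, user_id)
        self._permissions[key] = self._permissions.get(key, ModeratorPermission.NONE) & ~permissions

    async def has_permissions(self, server_id, user_id, permissions):
        # Flag containment: True only if every requested bit is set.
        return permissions in self._permissions.get((server_id, user_id), ModeratorPermission.NONE)

async def _demo():
    manager = InMemoryPermissionManager()
    await manager.add_permissions("server1", "user1", ModeratorPermission.WARN_USERS)
    assert await manager.has_permissions("server1", "user1", ModeratorPermission.WARN_USERS)
    await manager.remove_permissions("server1", "user1", ModeratorPermission.WARN_USERS)
    assert not await manager.has_permissions("server1", "user1", ModeratorPermission.WARN_USERS)

asyncio.run(_demo())
```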
| 42.416667 | 111 | 0.748527 | 54 | 509 | 6.888889 | 0.388889 | 0.080645 | 0.169355 | 0.185484 | 0.717742 | 0.717742 | 0.717742 | 0.717742 | 0.717742 | 0.717742 | 0 | 0 | 0.180747 | 509 | 11 | 112 | 46.272727 | 0.892086 | 0 | 0 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.125 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
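The `ipermission_manager.py` file above defines an interface whose async methods all raise `NotImplementedError`. A minimal in-memory sketch of a concrete implementation is shown below; the `ModeratorPermission` flag values here are illustrative stand-ins, not the project's real enum members.

```python
import asyncio
from enum import IntFlag

# Hypothetical stand-in for the project's ModeratorPermission flag enum.
class ModeratorPermission(IntFlag):
    NONE = 0
    WARN_USERS = 1
    MUTE_USERS = 2

class InMemoryPermissionManager:
    """Minimal in-memory sketch of the IPermissionManager contract."""

    def __init__(self):
        # Maps (server_id, user_id) -> accumulated ModeratorPermission flags.
        self._grants = {}

    async def add_permissions(self, server_id: str, user_id: str,
                              permissions: ModeratorPermission) -> None:
        key = (server_id, user_id)
        self._grants[key] = self._grants.get(key, ModeratorPermission.NONE) | permissions

    async def remove_permissions(self, server_id: str, user_id: str,
                                 permissions: ModeratorPermission) -> None:
        key = (server_id, user_id)
        self._grants[key] = self._grants.get(key, ModeratorPermission.NONE) & ~permissions

    async def has_permissions(self, server_id: str, user_id: str,
                              permissions: ModeratorPermission) -> bool:
        held = self._grants.get((server_id, user_id), ModeratorPermission.NONE)
        return (held & permissions) == permissions

async def demo():
    mgr = InMemoryPermissionManager()
    await mgr.add_permissions("s1", "u1", ModeratorPermission.WARN_USERS)
    return await mgr.has_permissions("s1", "u1", ModeratorPermission.WARN_USERS)

print(asyncio.run(demo()))  # True
```

Using an `IntFlag` lets several permissions be combined in a single value with `|` and checked as a subset, which matches the plural `permissions` parameter in the interface.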
482d5e841578f224f7ffca17d87d07c108f64015 | 10,446 | py | Python | catboost/spark/catboost4j-spark/core/src/test/generate_canonical_results/feature_importance_test.py | jochenater/catboost | de2786fbc633b0d6ea6a23b3862496c6151b95c2 | [
"Apache-2.0"
] | 6,989 | 2017-07-18T06:23:18.000Z | 2022-03-31T15:58:36.000Z | catboost/spark/catboost4j-spark/core/src/test/generate_canonical_results/feature_importance_test.py | jochenater/catboost | de2786fbc633b0d6ea6a23b3862496c6151b95c2 | [
"Apache-2.0"
] | 1,978 | 2017-07-18T09:17:58.000Z | 2022-03-31T14:28:43.000Z | catboost/spark/catboost4j-spark/core/src/test/generate_canonical_results/feature_importance_test.py | jochenater/catboost | de2786fbc633b0d6ea6a23b3862496c6151b95c2 | [
"Apache-2.0"
] | 1,228 | 2017-07-18T09:03:13.000Z | 2022-03-29T05:57:40.000Z | import json
import os
import catboost as cb
import utils
from config import CATBOOST_TEST_DATA_DIR, OUTPUT_DIR
def prediction_values_change():
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'higgs')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train.cd")
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', 'RMSE',
'--learn-set', learn_set_path,
'--cd', cd_path
],
model_class=cb.CatBoostRegressor
)
result = {}
for calc_type in ['Regular', 'Approximate', 'Exact']:
result['calc_type_' + calc_type] = model.get_feature_importance(
type=cb.EFstrType.PredictionValuesChange,
shap_calc_type=calc_type
).tolist()
prettified_result = model.get_feature_importance(
type=cb.EFstrType.PredictionValuesChange,
prettified=True,
shap_calc_type=calc_type
)
result['calc_type_' + calc_type + '_prettified'] = [
{
"featureName": prettified_result['Feature Id'][i],
"importance": prettified_result['Importances'][i]
}
for i in range(len(prettified_result.index))
]
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_prediction_values_change.json'), 'w'),
allow_nan=True,
indent=2
)
def loss_function_change():
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'querywise')
learn_set_path = os.path.join(dataset_dir, "train")
cd_path = os.path.join(dataset_dir, "train.cd")
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', 'QueryRMSE',
'--learn-set', learn_set_path,
'--cd', cd_path
],
model_class=cb.CatBoostRegressor
)
train_pool = cb.Pool(
learn_set_path,
column_description=cd_path
)
result = {}
for calc_type in ['Regular', 'Approximate', 'Exact']:
result['calc_type_' + calc_type] = model.get_feature_importance(
type=cb.EFstrType.LossFunctionChange,
data=train_pool,
shap_calc_type=calc_type
).tolist()
prettified_result = model.get_feature_importance(
type=cb.EFstrType.LossFunctionChange,
data=train_pool,
prettified=True,
shap_calc_type=calc_type
)
result['calc_type_' + calc_type + '_prettified'] = [
{
"featureName": prettified_result['Feature Id'][i],
"importance": prettified_result['Importances'][i]
}
for i in range(len(prettified_result.index))
]
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_loss_function_change.json'), 'w'),
allow_nan=True,
indent=2
)
def interaction():
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'querywise')
learn_set_path = os.path.join(dataset_dir, "train")
cd_path = os.path.join(dataset_dir, "train.cd")
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', 'QueryRMSE',
'--learn-set', learn_set_path,
'--cd', cd_path
],
model_class=cb.CatBoostRegressor
)
result = []
for firstFeatureIndex, secondFeatureIndex, score in model.get_feature_importance(type=cb.EFstrType.Interaction):
result.append(
{
"firstFeatureIndex": int(firstFeatureIndex),
"secondFeatureIndex": int(secondFeatureIndex),
"score": score
}
)
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_interaction.json'), 'w'),
allow_nan=True,
indent=2
)
def shap_values():
result = {}
for problem_type in ['Regression', 'BinClass', 'MultiClass']:
if problem_type == 'Regression':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'querywise')
learn_set_path = os.path.join(dataset_dir, "train")
cd_path = os.path.join(dataset_dir, "train.cd")
loss_function = 'QueryRMSE'
additional_train_params = []
model_class = cb.CatBoostRegressor
elif problem_type == 'BinClass':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'higgs')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train.cd")
loss_function = 'Logloss'
additional_train_params = []
model_class = cb.CatBoostClassifier
elif problem_type == 'MultiClass':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'cloudness_small')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train_float.cd")
loss_function = 'MultiClass'
additional_train_params = []
model_class = cb.CatBoostClassifier
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', loss_function,
'--learn-set', learn_set_path,
'--cd', cd_path
] + additional_train_params,
model_class=model_class
)
model.save_model(os.path.join(OUTPUT_DIR, "feature_importance_shap_values.problem_type=" + problem_type + ".cbm"))
train_pool = cb.Pool(
learn_set_path,
column_description=cd_path
)
for shap_mode in ['Auto', 'UsePreCalc', 'NoPreCalc']:
for shap_calc_type in ['Regular', 'Approximate', 'Exact']:
result_name = (
'problem_type=' + problem_type
+ ',shap_mode=' + shap_mode
+ ',shap_calc_type=' + shap_calc_type
)
result[result_name] = model.get_feature_importance(
type=cb.EFstrType.ShapValues,
data=train_pool,
shap_mode=shap_mode,
shap_calc_type=shap_calc_type
).tolist()
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_shap_values.json'), 'w'),
allow_nan=True,
indent=2
)
def prediction_diff():
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'higgs')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train.cd")
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', 'RMSE',
'--learn-set', learn_set_path,
'--cd', cd_path
],
model_class=cb.CatBoostRegressor
)
train_pool = cb.Pool(
learn_set_path,
column_description=cd_path
)
result = {}
result['simple'] = model.get_feature_importance(
type=cb.EFstrType.PredictionDiff,
data=train_pool.get_features()[:2]
).tolist()
prettified_result = model.get_feature_importance(
type=cb.EFstrType.PredictionDiff,
data=train_pool.get_features()[:2],
prettified=True
)
result['prettified'] = [
{
"featureName": prettified_result['Feature Id'][i],
"importance": prettified_result['Importances'][i]
}
for i in range(len(prettified_result.index))
]
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_prediction_diff.json'), 'w'),
allow_nan=True,
indent=2
)
def shap_interaction_values():
result = {}
for problem_type in ['Regression', 'BinClass', 'MultiClass']:
if problem_type == 'Regression':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'higgs')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train.cd")
loss_function = 'RMSE'
additional_train_params = []
model_class = cb.CatBoostRegressor
elif problem_type == 'BinClass':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'higgs')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train.cd")
loss_function = 'Logloss'
additional_train_params = []
model_class = cb.CatBoostClassifier
elif problem_type == 'MultiClass':
dataset_dir = os.path.join(CATBOOST_TEST_DATA_DIR, 'cloudness_small')
learn_set_path = os.path.join(dataset_dir, "train_small")
cd_path = os.path.join(dataset_dir, "train_float.cd")
loss_function = 'MultiClass'
additional_train_params = []
model_class = cb.CatBoostClassifier
model = utils.run_dist_train(
['--iterations', '20',
'--loss-function', loss_function,
'--learn-set', learn_set_path,
'--cd', cd_path
] + additional_train_params,
model_class=model_class
)
model.save_model(os.path.join(OUTPUT_DIR, "feature_importance_shap_interaction_values.problem_type=" + problem_type + ".cbm"))
pool_for_feature_importance = cb.Pool(
learn_set_path,
column_description=cd_path
).slice([0,1,2,3,4])
for shap_mode in ['Auto', 'UsePreCalc', 'NoPreCalc']:
for shap_calc_type in ['Regular']:
result_name = (
'problem_type=' + problem_type
+ ',shap_mode=' + shap_mode
+ ',shap_calc_type=' + shap_calc_type
)
result[result_name] = model.get_feature_importance(
type=cb.EFstrType.ShapInteractionValues,
data=pool_for_feature_importance,
shap_mode=shap_mode,
shap_calc_type=shap_calc_type
).tolist()
json.dump(
result,
fp=open(os.path.join(OUTPUT_DIR, 'feature_importance_shap_interaction_values.json'), 'w'),
allow_nan=True,
indent=2
)
def main():
prediction_values_change()
loss_function_change()
interaction()
shap_values()
prediction_diff()
shap_interaction_values()
| 33.480769 | 134 | 0.589508 | 1,138 | 10,446 | 5.084359 | 0.09754 | 0.039405 | 0.065676 | 0.048393 | 0.897511 | 0.897511 | 0.88887 | 0.879191 | 0.871587 | 0.830971 | 0 | 0.0034 | 0.29619 | 10,446 | 311 | 135 | 33.588424 | 0.783596 | 0 | 0 | 0.708955 | 0 | 0 | 0.145811 | 0.033317 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026119 | false | 0 | 0.100746 | 0 | 0.126866 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
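The `shap_values()` and `shap_interaction_values()` functions above name each result entry with a `problem_type=...,shap_mode=...,shap_calc_type=...` key built inside three nested loops. A small sketch of that naming scheme, runnable without CatBoost (the function name is illustrative, not part of the script):

```python
from itertools import product

def result_names(problem_types, shap_modes, shap_calc_types):
    """Build the 'key=value,...' result names used by the canonical-results script."""
    return [
        "problem_type=%s,shap_mode=%s,shap_calc_type=%s" % combo
        for combo in product(problem_types, shap_modes, shap_calc_types)
    ]

names = result_names(
    ["Regression", "BinClass", "MultiClass"],
    ["Auto", "UsePreCalc", "NoPreCalc"],
    ["Regular", "Approximate", "Exact"],
)
print(len(names))  # 27
print(names[0])    # problem_type=Regression,shap_mode=Auto,shap_calc_type=Regular
```

Flattening the grid this way keeps the JSON output a single flat object, with one key per parameter combination, which is convenient for diffing canonical results.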
4841ff660e8ab615dd34c486ce1e924b713d962d | 16,746 | py | Python | src/utils/utils_experiment.py | cchallu/dghl | 1cafd3e1390f1069fb8ce3aab2e3d3bc8271a079 | [
"Apache-2.0"
] | 2 | 2022-02-17T03:06:36.000Z | 2022-03-30T16:42:26.000Z | src/utils/utils_experiment.py | cchallu/dghl | 1cafd3e1390f1069fb8ce3aab2e3d3bc8271a079 | [
"Apache-2.0"
] | null | null | null | src/utils/utils_experiment.py | cchallu/dghl | 1cafd3e1390f1069fb8ce3aab2e3d3bc8271a079 | [
"Apache-2.0"
] | 1 | 2022-03-07T08:16:43.000Z | 2022-03-07T08:16:43.000Z | """
Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License").
You may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import os
import pickle
import numpy as np
import pandas as pd
import torch
from models.DGHL import DGHL
from models.DGHL_encoder import DGHL_encoder
from utils.utils import de_unfold
from utils.utils_visualization import plot_reconstruction_ts, plot_anomaly_scores
def train_DGHL(mc, train_data, test_data, test_labels, train_mask, test_mask, entities, make_plots, root_dir):
"""
train_data:
List of tensors with training data, each shape (n_time, 1, n_features)
test_data:
List of tensors with test data, each shape (n_time, 1, n_features)
test_labels:
List of arrays with test labels, each (n_time)
train_mask:
List of tensors with training mask, each shape (n_time, 1, n_features)
test_mask:
List of tensors with test mask, each shape (n_time, 1, n_features)
entities:
List of names with entities
"""
print(pd.Series(mc))
# --------------------------------------- Random seed --------------------------------------
np.random.seed(mc['random_seed'])
# --------------------------------------- Parse parameters --------------------------------------
window_size = mc['window_size']
window_hierarchy = mc['window_hierarchy']
window_step = mc['window_step']
n_features = mc['n_features']
total_window_size = window_size*window_hierarchy
# --------------------------------------- Data Processing --------------------------------------
n_entities = len(entities)
train_data_list = []
test_data_list = []
train_mask_list = []
test_mask_list = []
# Loop to pre-process each entity
for entity in range(n_entities):
#print(10*'-','entity ', entity, ': ', entities[entity], 10*'-')
train_data_entity = train_data[entity].copy()
test_data_entity = test_data[entity].copy()
train_mask_entity = train_mask[entity].copy()
test_mask_entity = test_mask[entity].copy()
assert train_data_entity.shape == train_mask_entity.shape, 'Train data and Train mask should have equal dimensions'
assert test_data_entity.shape == test_mask_entity.shape, 'Test data and Test mask should have equal dimensions'
assert train_data_entity.shape[2] == mc['n_features'], 'Train data should match n_features'
assert test_data_entity.shape[2] == mc['n_features'], 'Test data should match n_features'
# --------------------------------------- Data Processing ---------------------------------------
# Complete first window for test, padding from training data
padding = total_window_size - (len(test_data_entity) - total_window_size*(len(test_data_entity)//total_window_size))
test_data_entity = np.vstack([train_data_entity[-padding:], test_data_entity])
test_mask_entity = np.vstack([train_mask_entity[-padding:], test_mask_entity])
# Create rolling windows
train_data_entity = torch.Tensor(train_data_entity).float()
train_data_entity = train_data_entity.permute(0,2,1)
train_data_entity = train_data_entity.unfold(dimension=0, size=total_window_size, step=window_step)
test_data_entity = torch.Tensor(test_data_entity).float()
test_data_entity = test_data_entity.permute(0,2,1)
test_data_entity = test_data_entity.unfold(dimension=0, size=total_window_size, step=window_step)
train_mask_entity = torch.Tensor(train_mask_entity).float()
train_mask_entity = train_mask_entity.permute(0,2,1)
train_mask_entity = train_mask_entity.unfold(dimension=0, size=total_window_size, step=window_step)
test_mask_entity = torch.Tensor(test_mask_entity).float()
test_mask_entity = test_mask_entity.permute(0,2,1)
test_mask_entity = test_mask_entity.unfold(dimension=0, size=total_window_size, step=window_step)
train_data_list.append(train_data_entity)
test_data_list.append(test_data_entity)
train_mask_list.append(train_mask_entity)
test_mask_list.append(test_mask_entity)
# Append all windows for complete windows data
train_windows_data = torch.vstack(train_data_list)
train_windows_mask = torch.vstack(train_mask_list)
# -------------------------------------------- Instantiate and train Model --------------------------------------------
print('Training model...')
model = DGHL(window_size=window_size, window_step=mc['window_step'], window_hierarchy=window_hierarchy,
hidden_multiplier=mc['hidden_multiplier'], max_filters=mc['max_filters'],
kernel_multiplier=mc['kernel_multiplier'], n_channels=n_features,
z_size=mc['z_size'], z_size_up=mc['z_size_up'], z_iters=mc['z_iters'],
z_sigma=mc['z_sigma'], z_step_size=mc['z_step_size'],
z_with_noise=mc['z_with_noise'], z_persistent=mc['z_persistent'],
batch_size=mc['batch_size'], learning_rate=mc['learning_rate'],
noise_std=mc['noise_std'],
normalize_windows=mc['normalize_windows'],
random_seed=mc['random_seed'], device=mc['device'])
model.fit(X=train_windows_data, mask=train_windows_mask, n_iterations=mc['n_iterations'])
# -------------------------------------------- Inference on each entity --------------------------------------------
for entity in range(n_entities):
rootdir_entity = f'{root_dir}/{entities[entity]}'
os.makedirs(name=rootdir_entity, exist_ok=True)
# Plots of reconstruction in train
print('Reconstructing train...')
x_train_true, x_train_hat, _, mask_windows = model.predict(X=train_data_list[entity], mask=train_mask_list[entity],
z_iters=mc['z_iters_inference'])
x_train_true, _ = de_unfold(x_windows=x_train_true, mask_windows=mask_windows, window_step=window_step)
x_train_hat, _ = de_unfold(x_windows=x_train_hat, mask_windows=mask_windows, window_step=window_step)
x_train_true = np.swapaxes(x_train_true,0,1)
x_train_hat = np.swapaxes(x_train_hat,0,1)
if make_plots:
filename = f'{rootdir_entity}/reconstruction_train.png'
plot_reconstruction_ts(x=x_train_true, x_hat=x_train_hat, n_features=n_features, filename=filename)
# --------------------------------------- Inference on test and anomaly scores ---------------------------------------
print('Computing scores on test...')
score_windows, ts_score, x_windows, x_hat_windows, _, mask_windows = model.anomaly_score(X=test_data_list[entity],
mask=test_mask_list[entity],
z_iters=mc['z_iters_inference'])
# Post-processing
# Fold windows
score_windows = score_windows[:,None,None,:]
score_mask = np.ones(score_windows.shape)
score, _ = de_unfold(x_windows=score_windows, mask_windows=score_mask, window_step=window_step)
x_test_true, _ = de_unfold(x_windows=x_windows, mask_windows=mask_windows, window_step=window_step)
x_test_hat, _ = de_unfold(x_windows=x_hat_windows, mask_windows=mask_windows, window_step=window_step)
x_test_true = np.swapaxes(x_test_true,0,1)
x_test_hat = np.swapaxes(x_test_hat,0,1)
score = score.flatten()
score = score[-len(test_labels[entity]):]
if make_plots:
filename = f'{rootdir_entity}/reconstruction_test.png'
plot_reconstruction_ts(x=x_test_true, x_hat=x_test_hat, n_features=n_features, filename=filename)
# Plot scores
if make_plots:
filename = f'{rootdir_entity}/anomaly_scores.png'
plot_anomaly_scores(score=score, labels=test_labels[entity], filename=filename)
results = {'score': score, 'ts_score':ts_score, 'x_test_true':x_test_true, 'x_test_hat':x_test_hat, 'labels':test_labels,
'x_train_true':x_train_true, 'x_train_hat':x_train_hat, 'train_mask': train_mask, 'mc':mc}
with open(f'{rootdir_entity}/results.p','wb') as f:
pickle.dump(results, f)
def train_DGHL_encoder(mc, train_data, test_data, test_labels, train_mask, test_mask, entities, make_plots, root_dir):
"""
train_data:
List of tensors with training data, each shape (n_time, 1, n_features)
test_data:
List of tensors with test data, each shape (n_time, 1, n_features)
test_labels:
List of arrays with test labels, each (n_time)
train_mask:
List of tensors with training mask, each shape (n_time, 1, n_features)
test_mask:
List of tensors with test mask, each shape (n_time, 1, n_features)
entities:
List of names with entities
"""
print(pd.Series(mc))
# --------------------------------------- Random seed --------------------------------------
np.random.seed(mc['random_seed'])
# --------------------------------------- Parse parameters --------------------------------------
window_size = mc['window_size']
window_hierarchy = mc['window_hierarchy']
window_step = mc['window_step']
n_features = mc['n_features']
total_window_size = window_size*window_hierarchy
# --------------------------------------- Data Processing --------------------------------------
n_entities = len(entities)
train_data_list = []
test_data_list = []
train_mask_list = []
test_mask_list = []
# Loop to pre-process each entity
for entity in range(n_entities):
#print(10*'-','entity ', entity, ': ', entities[entity], 10*'-')
train_data_entity = train_data[entity].copy()
test_data_entity = test_data[entity].copy()
train_mask_entity = train_mask[entity].copy()
test_mask_entity = test_mask[entity].copy()
assert train_data_entity.shape == train_mask_entity.shape, 'Train data and Train mask should have equal dimensions'
assert test_data_entity.shape == test_mask_entity.shape, 'Test data and Test mask should have equal dimensions'
assert train_data_entity.shape[2] == mc['n_features'], 'Train data should match n_features'
assert test_data_entity.shape[2] == mc['n_features'], 'Test data should match n_features'
# --------------------------------------- Data Processing ---------------------------------------
# Complete first window for test, padding from training data
padding = total_window_size - (len(test_data_entity) - total_window_size*(len(test_data_entity)//total_window_size))
test_data_entity = np.vstack([train_data_entity[-padding:], test_data_entity])
test_mask_entity = np.vstack([train_mask_entity[-padding:], test_mask_entity])
# Create rolling windows
train_data_entity = torch.Tensor(train_data_entity).float()
train_data_entity = train_data_entity.permute(0,2,1)
train_data_entity = train_data_entity.unfold(dimension=0, size=total_window_size, step=window_step)
test_data_entity = torch.Tensor(test_data_entity).float()
test_data_entity = test_data_entity.permute(0,2,1)
test_data_entity = test_data_entity.unfold(dimension=0, size=total_window_size, step=window_step)
train_mask_entity = torch.Tensor(train_mask_entity).float()
train_mask_entity = train_mask_entity.permute(0,2,1)
train_mask_entity = train_mask_entity.unfold(dimension=0, size=total_window_size, step=window_step)
test_mask_entity = torch.Tensor(test_mask_entity).float()
test_mask_entity = test_mask_entity.permute(0,2,1)
test_mask_entity = test_mask_entity.unfold(dimension=0, size=total_window_size, step=window_step)
train_data_list.append(train_data_entity)
test_data_list.append(test_data_entity)
train_mask_list.append(train_mask_entity)
test_mask_list.append(test_mask_entity)
# Append all windows for complete windows data
train_windows_data = torch.vstack(train_data_list)
train_windows_mask = torch.vstack(train_mask_list)
# -------------------------------------------- Instantiate and train Model --------------------------------------------
print('Training model...')
model = DGHL_encoder(window_size=window_size, window_step=mc['window_step'], window_hierarchy=window_hierarchy,
hidden_multiplier=mc['hidden_multiplier'], max_filters=mc['max_filters'],
kernel_multiplier=mc['kernel_multiplier'], n_channels=n_features,
z_size=mc['z_size'], z_size_up=mc['z_size_up'],
batch_size=mc['batch_size'], learning_rate=mc['learning_rate'],
noise_std=mc['noise_std'],
normalize_windows=mc['normalize_windows'],
random_seed=mc['random_seed'], device=mc['device'])
model.fit(X=train_windows_data, mask=train_windows_mask, n_iterations=mc['n_iterations'])
# -------------------------------------------- Inference on each entity --------------------------------------------
for entity in range(n_entities):
rootdir_entity = f'{root_dir}/{entities[entity]}'
os.makedirs(name=rootdir_entity, exist_ok=True)
# Plots of reconstruction in train
print('Reconstructing train...')
x_train_true, x_train_hat, _, mask_windows = model.predict(X=train_data_list[entity], mask=train_mask_list[entity])
x_train_true, _ = de_unfold(x_windows=x_train_true, mask_windows=mask_windows, window_step=window_step)
x_train_hat, _ = de_unfold(x_windows=x_train_hat, mask_windows=mask_windows, window_step=window_step)
x_train_true = np.swapaxes(x_train_true,0,1)
x_train_hat = np.swapaxes(x_train_hat,0,1)
if make_plots:
filename = f'{rootdir_entity}/reconstruction_train.png'
plot_reconstruction_ts(x=x_train_true, x_hat=x_train_hat, n_features=n_features, filename=filename)
# --------------------------------------- Inference on test and anomaly scores ---------------------------------------
print('Computing scores on test...')
score_windows, ts_score, x_windows, x_hat_windows, _, mask_windows = model.anomaly_score(X=test_data_list[entity],
mask=test_mask_list[entity])
# Post-processing
# Fold windows
score_windows = score_windows[:,None,None,:]
score_mask = np.ones(score_windows.shape)
score, _ = de_unfold(x_windows=score_windows, mask_windows=score_mask, window_step=window_step)
x_test_true, _ = de_unfold(x_windows=x_windows, mask_windows=mask_windows, window_step=window_step)
x_test_hat, _ = de_unfold(x_windows=x_hat_windows, mask_windows=mask_windows, window_step=window_step)
x_test_true = np.swapaxes(x_test_true,0,1)
x_test_hat = np.swapaxes(x_test_hat,0,1)
score = score.flatten()
score = score[-len(test_labels[entity]):]
if make_plots:
filename = f'{rootdir_entity}/reconstruction_test.png'
plot_reconstruction_ts(x=x_test_true, x_hat=x_test_hat, n_features=n_features, filename=filename)
# Plot scores
if make_plots:
filename = f'{rootdir_entity}/anomaly_scores.png'
plot_anomaly_scores(score=score, labels=test_labels[entity], filename=filename)
results = {'score': score, 'ts_score':ts_score, 'labels':test_labels, 'mc':mc}
# results = {'score': score, 'ts_score':ts_score, 'x_test_true':x_test_true, 'x_test_hat':x_test_hat, 'labels':test_labels,
# 'x_train_true':x_train_true, 'x_train_hat':x_train_hat, 'train_mask': train_mask, 'mc':mc}
with open(f'{rootdir_entity}/results.p','wb') as f:
pickle.dump(results, f) | 52.006211 | 131 | 0.635674 | 2,160 | 16,746 | 4.575 | 0.093056 | 0.054645 | 0.042502 | 0.016191 | 0.91611 | 0.914693 | 0.914693 | 0.914693 | 0.914693 | 0.910038 | 0 | 0.005418 | 0.206437 | 16,746 | 322 | 132 | 52.006211 | 0.738205 | 0.225905 | 0 | 0.870968 | 0 | 0 | 0.114096 | 0.026708 | 0 | 0 | 0 | 0 | 0.043011 | 1 | 0.010753 | false | 0 | 0.048387 | 0 | 0.05914 | 0.043011 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
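Both `train_DGHL` and `train_DGHL_encoder` above pad the start of the test series with the tail of the training series so that its length becomes a multiple of `total_window_size` before windowing. The padding arithmetic can be sketched on plain lists, without torch or numpy (the function name is illustrative):

```python
def pad_from_train(train, test, total_window_size):
    """Prepend the tail of `train` so len(test) becomes a multiple of total_window_size.

    Mirrors the padding arithmetic used in train_DGHL above, on plain lists.
    """
    remainder = len(test) - total_window_size * (len(test) // total_window_size)
    padding = total_window_size - remainder
    return train[-padding:] + test

padded = pad_from_train(list(range(100)), list(range(7)), 5)
print(len(padded))  # 10
```

Note that when the test length is already a multiple of the window size, `padding` equals `total_window_size`, so one full extra window of training data is prepended; that matches the expression used in the functions above.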
486b401f106af6df709d8d8f38b1b0631741bcd9 | 2,908 | py | Python | usersec/migrations/0002_hpcgroup_adjustments.py | bihealth/hpc-access | ff606b18b18230af2876a791ca706d3b24addb59 | [
"MIT"
] | null | null | null | usersec/migrations/0002_hpcgroup_adjustments.py | bihealth/hpc-access | ff606b18b18230af2876a791ca706d3b24addb59 | [
"MIT"
] | 27 | 2022-02-11T15:51:24.000Z | 2022-03-31T12:11:20.000Z | usersec/migrations/0002_hpcgroup_adjustments.py | bihealth/hpc-access | ff606b18b18230af2876a791ca706d3b24addb59 | [
"MIT"
] | null | null | null | # Generated by Django 4.0.2 on 2022-03-02 12:20
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("usersec", "0001_initial"),
]
operations = [
migrations.AlterField(
model_name="hpcgroup",
name="owner",
field=models.ForeignKey(
help_text="User registered as owner of the group",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="%(class)s_owner",
to="usersec.hpcuser",
),
),
migrations.AlterField(
model_name="hpcgroup",
name="status",
field=models.CharField(
choices=[
("INITIAL", "INITIAL"),
("ACTIVE", "ACTIVE"),
("DELETED", "DELETED"),
("EXPIRED", "EXPIRED"),
],
help_text="Status of the group object",
max_length=16,
),
),
migrations.AlterField(
model_name="hpcgroupversion",
name="owner",
field=models.ForeignKey(
help_text="User registered as owner of the group",
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="%(class)s_owner",
to="usersec.hpcuser",
),
),
migrations.AlterField(
model_name="hpcgroupversion",
name="status",
field=models.CharField(
choices=[
("INITIAL", "INITIAL"),
("ACTIVE", "ACTIVE"),
("DELETED", "DELETED"),
("EXPIRED", "EXPIRED"),
],
help_text="Status of the group object",
max_length=16,
),
),
migrations.AlterField(
model_name="hpcuser",
name="status",
field=models.CharField(
choices=[
("INITIAL", "INITIAL"),
("ACTIVE", "ACTIVE"),
("DELETED", "DELETED"),
("EXPIRED", "EXPIRED"),
],
help_text="Status of the user object",
max_length=16,
),
),
migrations.AlterField(
model_name="hpcuserversion",
name="status",
field=models.CharField(
choices=[
("INITIAL", "INITIAL"),
("ACTIVE", "ACTIVE"),
("DELETED", "DELETED"),
("EXPIRED", "EXPIRED"),
],
help_text="Status of the user object",
max_length=16,
),
),
]
| 31.268817 | 66 | 0.434319 | 220 | 2,908 | 5.636364 | 0.272727 | 0.096774 | 0.120968 | 0.140323 | 0.846774 | 0.846774 | 0.78629 | 0.78629 | 0.762903 | 0.762903 | 0 | 0.01677 | 0.446355 | 2,908 | 92 | 67 | 31.608696 | 0.753416 | 0.015475 | 0 | 0.883721 | 1 | 0 | 0.19993 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.023256 | 0 | 0.05814 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
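The migration above repeats the same four-state `choices` list in every `AlterField`. In application code, such a list is usually factored into a single shared constant; a minimal sketch (names here are illustrative, not taken from the project):

```python
# Shared status choices, as used repeatedly in the migration above.
OBJECT_STATUS_CHOICES = [
    ("INITIAL", "INITIAL"),
    ("ACTIVE", "ACTIVE"),
    ("DELETED", "DELETED"),
    ("EXPIRED", "EXPIRED"),
]

# Set of the stored values, handy for validation outside the ORM.
VALID_STATUSES = {value for value, _label in OBJECT_STATUS_CHOICES}

def is_valid_status(status):
    return status in VALID_STATUSES

print(is_valid_status("ACTIVE"))   # True
print(is_valid_status("UNKNOWN"))  # False
```

Migrations themselves intentionally inline the choices (they snapshot model state at a point in time), so the duplication in the generated file is expected; the constant helps only in the live model definitions.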
d20422a2e661c658065e92cba23e21e0e355980c | 180 | py | Python | Road_safety/hackathon/forms.py | mukul54/xyz-bosch | 92a3f2d1e8c8308547b2e9de5d5b20ebdd6d925a | [
"Apache-2.0"
] | 3 | 2019-07-05T16:52:34.000Z | 2021-07-09T09:01:03.000Z | Road_safety/hackathon/forms.py | mukul54/xyz-bosch | 92a3f2d1e8c8308547b2e9de5d5b20ebdd6d925a | [
"Apache-2.0"
] | 5 | 2020-08-18T21:45:56.000Z | 2021-04-13T14:36:47.000Z | Road_safety/hackathon/forms.py | mukul54/xyz-bosch | 92a3f2d1e8c8308547b2e9de5d5b20ebdd6d925a | [
"Apache-2.0"
] | 2 | 2019-07-02T21:36:40.000Z | 2019-08-23T16:17:11.000Z | from django import forms
class LoginForm(forms.Form):
username = forms.CharField(max_length=60)
password = forms.CharField(max_length=60, widget=forms.PasswordInput()) | 36 | 75 | 0.755556 | 23 | 180 | 5.826087 | 0.652174 | 0.208955 | 0.253731 | 0.343284 | 0.373134 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025806 | 0.138889 | 180 | 5 | 75 | 36 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.25 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 7 |
d21fbe2cb2a6286d6a2a99fbfeb7b8955af0ffb5 | 173 | py | Python | python/src/collatz/__init__.py | Skenvy/Collatz | 9b1738221fc421d153eabd37c837239b189c6bed | [
"Apache-2.0"
] | null | null | null | python/src/collatz/__init__.py | Skenvy/Collatz | 9b1738221fc421d153eabd37c837239b189c6bed | [
"Apache-2.0"
] | null | null | null | python/src/collatz/__init__.py | Skenvy/Collatz | 9b1738221fc421d153eabd37c837239b189c6bed | [
"Apache-2.0"
] | null | null | null | from .__version__ import __version__
from .parameterised import *
from .parameterised import _ErrMsg
from .parameterised import _CC
from .parameterised import _KNOWN_CYCLES
| 28.833333 | 40 | 0.849711 | 20 | 173 | 6.75 | 0.4 | 0.503704 | 0.681481 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.115607 | 173 | 5 | 41 | 34.6 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d266e7e5f351b8bc0c6594fe1e92ff8f319cb5f0 | 189 | py | Python | devel/lib/python2.7/dist-packages/motoman_msgs/msg/__init__.py | Pontiky/yaskawa-hc10-moveit | 2a6031f9404d285aa662636ccc941485b339e7fd | [
"BSD-2-Clause"
] | 1 | 2021-05-19T04:09:29.000Z | 2021-05-19T04:09:29.000Z | devel/lib/python2.7/dist-packages/motoman_msgs/msg/__init__.py | Pontiky/yaskawa-hc10-moveit | 2a6031f9404d285aa662636ccc941485b339e7fd | [
"BSD-2-Clause"
] | null | null | null | devel/lib/python2.7/dist-packages/motoman_msgs/msg/__init__.py | Pontiky/yaskawa-hc10-moveit | 2a6031f9404d285aa662636ccc941485b339e7fd | [
"BSD-2-Clause"
] | null | null | null | from ._DynamicJointPoint import *
from ._DynamicJointState import *
from ._DynamicJointTrajectory import *
from ._DynamicJointTrajectoryFeedback import *
from ._DynamicJointsGroup import *
| 31.5 | 46 | 0.84127 | 15 | 189 | 10.266667 | 0.466667 | 0.25974 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10582 | 189 | 5 | 47 | 37.8 | 0.911243 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
962c258ab7295ef7118b4f750bde90492ee5234a | 5,499 | py | Python | scripts/twitPersonality/embeddings.py | IllinoisSocialMediaMacroscope/smm-bae | 9fea6fa61369db16c2bd95bc409be82c1f7a5c50 | [
"Apache-2.0"
] | 1 | 2018-12-11T18:57:15.000Z | 2018-12-11T18:57:15.000Z | scripts/twitPersonality/embeddings.py | IllinoisSocialMediaMacroscope/smm-bae | 9fea6fa61369db16c2bd95bc409be82c1f7a5c50 | [
"Apache-2.0"
] | 1 | 2022-01-22T03:08:48.000Z | 2022-01-22T03:08:48.000Z | scripts/twitPersonality/embeddings.py | IllinoisSocialMediaMacroscope/smm-bae | 9fea6fa61369db16c2bd95bc409be82c1f7a5c50 | [
"Apache-2.0"
] | 1 | 2021-11-04T01:10:18.000Z | 2021-11-04T01:10:18.000Z | from sklearn.feature_extraction.text import CountVectorizer
import numpy as np
#the function expects documents to be a list of documents; a single string is also accepted and is wrapped in a one-element list internally
def transformTextForTraining(embed_dictionary, length_threshold, documents, y_O, y_C, y_E, y_A, y_N, operation, FastText, friends=None):
vectorizer = CountVectorizer(stop_words="english", analyzer="word")
analyzer = vectorizer.build_analyzer()
tokenizer = vectorizer.build_tokenizer()
string = False
deleted = 0
if type(documents) is str: #single post
string = True
documents = [documents]
text_embeddings = []
i = 0
for document in documents:
words = analyzer(document)
#words = tokenizer(document)
if len(words) < length_threshold and not string:
deleted += 1
y_O = np.delete(y_O, i)
y_C = np.delete(y_C, i)
y_E = np.delete(y_E, i)
y_A = np.delete(y_A, i)
y_N = np.delete(y_N, i)
if friends is not None:
friends = np.delete(friends, i)
continue
doc_embeddings = []
for word in words:
try:
word_embedding = embed_dictionary[word]
if FastText:
word_embedding = np.array(list(float(value) for value in word_embedding[:-1].split(" ")))
doc_embeddings.append(word_embedding)
except KeyError:
continue
if len(doc_embeddings) == 0 and not string:
deleted += 1
y_O = np.delete(y_O, i)
y_C = np.delete(y_C, i)
y_E = np.delete(y_E, i)
y_A = np.delete(y_A, i)
y_N = np.delete(y_N, i)
if friends is not None:
friends = np.delete(friends, i)
continue
if len(doc_embeddings) == 0:
return False
if friends is not None:
if operation=="sum":
text_embeddings.append( np.append(np.sum(np.array(doc_embeddings),axis=0),friends[i]) )
elif operation == "max":
text_embeddings.append(np.append(np.amax(np.array(doc_embeddings),axis=0),friends[i]) )
elif operation == "min":
text_embeddings.append(np.append(np.amin(np.array(doc_embeddings),axis=0),friends[i]) )
elif operation == "avg":
text_embeddings.append(np.append(np.mean(np.array(doc_embeddings),axis=0),friends[i]) )
elif operation == "conc":
npmax = np.amax(np.array(doc_embeddings),axis=0)
npmin = np.amin(np.array(doc_embeddings),axis=0)
npavg = np.mean(np.array(doc_embeddings),axis=0)
text_embeddings.append( np.append(np.concatenate((npmax, npmin, npavg)),friends[i]) )
else:
if operation=="sum":
text_embeddings.append(np.sum(np.array(doc_embeddings),axis=0))
elif operation == "max":
text_embeddings.append(np.amax(np.array(doc_embeddings),axis=0))
elif operation == "min":
text_embeddings.append(np.amin(np.array(doc_embeddings),axis=0))
elif operation == "avg":
text_embeddings.append(np.mean(np.array(doc_embeddings),axis=0))
elif operation == "conc":
npmax = np.amax(np.array(doc_embeddings),axis=0)
npmin = np.amin(np.array(doc_embeddings),axis=0)
npavg = np.mean(np.array(doc_embeddings),axis=0)
text_embeddings.append(np.concatenate((npmax, npmin, npavg)))
i += 1
if friends is not None:
return [np.array(text_embeddings), y_O, y_C, y_E, y_A, y_N, friends]
else:
return [np.array(text_embeddings), y_O, y_C, y_E, y_A, y_N]
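The pooling branches above (sum/max/min/avg/conc over per-word vectors) can be sketched in isolation. `pool` below is an illustrative helper written for this note, not part of the original module:

```python
import numpy as np

def pool(doc_embeddings, operation):
    """Aggregate a list of word vectors into one document vector.

    Mirrors the sum/max/min/avg/conc branches used above; illustrative only.
    """
    arr = np.array(doc_embeddings)
    if operation == "sum":
        return np.sum(arr, axis=0)
    elif operation == "max":
        return np.amax(arr, axis=0)
    elif operation == "min":
        return np.amin(arr, axis=0)
    elif operation == "avg":
        return np.mean(arr, axis=0)
    elif operation == "conc":
        # concatenating max, min and mean triples the dimensionality
        return np.concatenate((np.amax(arr, axis=0),
                               np.amin(arr, axis=0),
                               np.mean(arr, axis=0)))
    raise ValueError("unknown operation: %s" % operation)

vecs = [np.array([1.0, 2.0]), np.array([3.0, 0.0])]
print(pool(vecs, "avg"))   # [2. 1.]
print(pool(vecs, "conc"))  # [3. 2. 1. 0. 2. 1.]
```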
def transformTextForTesting(embed_dictionary, length_threshold, documents, operation):
vectorizer = CountVectorizer(stop_words="english", analyzer="word")
analyzer = vectorizer.build_analyzer()
text_embeddings = []
i = 0
for document in documents:
words = analyzer(document)
if len(words) < length_threshold:
#move to the next document
continue
doc_embeddings = []
for word in words:
try:
word_embedding_string = embed_dictionary[word]
word_embedding = np.array(list(float(value) for value in word_embedding_string[:-1].split(" ")))
doc_embeddings.append(word_embedding)
except KeyError:
continue
if len(doc_embeddings) == 0:
continue
i += 1
if operation=="sum":
text_embeddings.append(np.sum(np.array(doc_embeddings),axis=0))
elif operation == "max":
text_embeddings.append(np.amax(np.array(doc_embeddings),axis=0))
elif operation == "min":
text_embeddings.append(np.amin(np.array(doc_embeddings),axis=0))
elif operation == "avg":
text_embeddings.append(np.mean(np.array(doc_embeddings),axis=0))
elif operation == "conc":
npmax = np.amax(np.array(doc_embeddings),axis=0)
npmin = np.amin(np.array(doc_embeddings),axis=0)
npavg = np.mean(np.array(doc_embeddings),axis=0)
text_embeddings.append(np.concatenate((npmax, npmin, npavg)))
if len(text_embeddings) == 0:
raise ValueError
return np.array(text_embeddings) | 41.977099 | 136 | 0.589016 | 692 | 5,499 | 4.524566 | 0.143064 | 0.116257 | 0.067071 | 0.134142 | 0.832961 | 0.787927 | 0.752475 | 0.717023 | 0.717023 | 0.702012 | 0 | 0.008799 | 0.297327 | 5,499 | 131 | 137 | 41.977099 | 0.801501 | 0.029824 | 0 | 0.747826 | 0 | 0 | 0.013503 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.017391 | false | 0 | 0.017391 | 0 | 0.069565 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
962c300fc4d3ebe068fc610f0dcfc42c741b9318 | 2,202 | py | Python | S4/S4 Library/simulation/conditional_layers/conditional_layer_commands.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | 1 | 2021-05-20T19:33:37.000Z | 2021-05-20T19:33:37.000Z | S4/S4 Library/simulation/conditional_layers/conditional_layer_commands.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | null | null | null | S4/S4 Library/simulation/conditional_layers/conditional_layer_commands.py | NeonOcean/Environment | ca658cf66e8fd6866c22a4a0136d415705b36d26 | [
"CC-BY-4.0"
] | null | null | null | from conditional_layers.conditional_layer_service import ConditionalLayerRequestSpeedType
from server_commands.argument_helpers import TunableInstanceParam
import services
import sims4.commands
@sims4.commands.Command('layers.load_layer')
def load_conditional_layer(conditional_layer:TunableInstanceParam(sims4.resources.Types.CONDITIONAL_LAYER), immediate:bool=True, timer_interval:int=1, timer_object_count:int=5):
if conditional_layer is None:
sims4.commands.output('Unable to find the conditional_layer instance specified.')
return
conditional_layer_service = services.conditional_layer_service()
speed = ConditionalLayerRequestSpeedType.IMMEDIATELY if immediate else ConditionalLayerRequestSpeedType.GRADUALLY
conditional_layer_service.load_conditional_layer(conditional_layer, speed=speed, timer_interval=timer_interval, timer_object_count=timer_object_count)
@sims4.commands.Command('layers.destroy_layer')
def destroy_conditional_layer(conditional_layer:TunableInstanceParam(sims4.resources.Types.CONDITIONAL_LAYER), immediate:bool=True, timer_interval:int=1, timer_object_count:int=5):
conditional_layer_service = services.conditional_layer_service()
speed = ConditionalLayerRequestSpeedType.IMMEDIATELY if immediate else ConditionalLayerRequestSpeedType.GRADUALLY
conditional_layer_service.destroy_conditional_layer(conditional_layer, speed=speed, timer_interval=timer_interval, timer_object_count=timer_object_count)
@sims4.commands.Command('layers.reload_layer')
def reload_conditional_layer(conditional_layer:TunableInstanceParam(sims4.resources.Types.CONDITIONAL_LAYER), immediate:bool=True, timer_interval:int=1, timer_object_count:int=5):
conditional_layer_service = services.conditional_layer_service()
speed = ConditionalLayerRequestSpeedType.IMMEDIATELY if immediate else ConditionalLayerRequestSpeedType.GRADUALLY
conditional_layer_service.destroy_conditional_layer(conditional_layer, speed=speed, timer_interval=timer_interval, timer_object_count=timer_object_count)
conditional_layer_service.load_conditional_layer(conditional_layer, speed=speed, timer_interval=timer_interval, timer_object_count=timer_object_count)
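The `@sims4.commands.Command(...)` decorator above registers console commands under string names. A minimal stand-alone registry showing the same idea (all names here are illustrative, not the Sims 4 API):

```python
COMMANDS = {}

def command(name):
    """Decorator that registers a function under a console-command name."""
    def register(func):
        COMMANDS[name] = func
        return func
    return register

@command('layers.load_layer')
def load_layer(layer_name, immediate=True):
    # Hypothetical handler body, standing in for the real service calls.
    return 'loading %s (%s)' % (layer_name, 'now' if immediate else 'gradually')

print(COMMANDS['layers.load_layer']('forest', immediate=False))
# loading forest (gradually)
```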
| 81.555556 | 180 | 0.865123 | 252 | 2,202 | 7.214286 | 0.18254 | 0.264026 | 0.139164 | 0.123212 | 0.810781 | 0.80473 | 0.80473 | 0.80473 | 0.80473 | 0.80473 | 0 | 0.006826 | 0.068574 | 2,202 | 26 | 181 | 84.692308 | 0.879571 | 0 | 0 | 0.434783 | 0 | 0 | 0.050863 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.130435 | false | 0 | 0.173913 | 0 | 0.347826 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
965d697abe9050e33f821acb378c2c5b1f222c88 | 763 | py | Python | examples/v1.0.0/example.py | catsital/pycasso | eb54cd82e66d06b3677c2c068716ceb818c19f97 | [
"MIT"
] | 4 | 2021-11-08T08:35:10.000Z | 2022-02-23T21:22:11.000Z | examples/v1.0.0/example.py | catsital/pycasso | eb54cd82e66d06b3677c2c068716ceb818c19f97 | [
"MIT"
] | null | null | null | examples/v1.0.0/example.py | catsital/pycasso | eb54cd82e66d06b3677c2c068716ceb818c19f97 | [
"MIT"
] | null | null | null | from pycasso import Canvas
img = '../examples/en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2.png'
slice_size = 30
seed = 'Pycasso'
pyc = Canvas(img, slice_size, seed)
pyc.export(mode='scramble', path='en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2_v1.0.0-prng.png')
# Canvas(img, slice_size, seed).export('scramble', 'en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2_v1.0.0-prng.png')
img = 'en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2_v1.0.0-prng.png'
slice_size = 30
seed = 'Pycasso'
pyc = Canvas(img, slice_size, seed)
pyc.export(mode='unscramble', path='en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2_v1.0.0-prng-unscramble.png')
# Canvas(img, slice_size, seed).export('unscramble', 'en_Pepper-and-Carrot_by-David-Revoy_E05P01_p2_v1.0.0-prng-unscramble.png')
| 42.388889 | 128 | 0.775885 | 135 | 763 | 4.125926 | 0.214815 | 0.086176 | 0.118492 | 0.183124 | 0.868941 | 0.868941 | 0.868941 | 0.768402 | 0.768402 | 0.701975 | 0 | 0.068724 | 0.065531 | 763 | 17 | 129 | 44.882353 | 0.712482 | 0.314548 | 0 | 0.545455 | 0 | 0 | 0.551923 | 0.490385 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
967702d091c15203c5625e7ed8b2fa2b54732d3d | 86 | py | Python | Dynamics/Discrete/test.py | lambertdw/CalculiX-Examples | 1c003ff3a9dd8872c6c44b4cfaf3698997346465 | [
"MIT"
] | null | null | null | Dynamics/Discrete/test.py | lambertdw/CalculiX-Examples | 1c003ff3a9dd8872c6c44b4cfaf3698997346465 | [
"MIT"
] | null | null | null | Dynamics/Discrete/test.py | lambertdw/CalculiX-Examples | 1c003ff3a9dd8872c6c44b4cfaf3698997346465 | [
"MIT"
] | null | null | null | #!/usr/bin/python
import os
os.system("cgx -b run.fbd")
os.system("cgx -b runM.fbd")
| 14.333333 | 28 | 0.662791 | 17 | 86 | 3.352941 | 0.647059 | 0.280702 | 0.385965 | 0.421053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116279 | 86 | 5 | 29 | 17.2 | 0.75 | 0.186047 | 0 | 0 | 0 | 0 | 0.42029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
96783813764d7f62c9c368cb8c5d05bd708b3231 | 13,299 | py | Python | tests/testowyl.py | lullabee/owyl | db0458bce9ff378bce1ffb7e7b93c86a1a0e5743 | [
"BSD-3-Clause"
] | null | null | null | tests/testowyl.py | lullabee/owyl | db0458bce9ff378bce1ffb7e7b93c86a1a0e5743 | [
"BSD-3-Clause"
] | null | null | null | tests/testowyl.py | lullabee/owyl | db0458bce9ff378bce1ffb7e7b93c86a1a0e5743 | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""testowyl -- some tests for owyl.
Copyright 2008 David Eyk. All rights reserved.
$Author$\n
$Rev$\n
$Date$
"""
__author__ = "$Author$"[9:-2]
__revision__ = "$Rev$"[6:-2]
__date__ = "$Date$"[7:-2]
import unittest
import owyl
from owyl import blackboard
class OwylTests(unittest.TestCase):
"""Tests for Owyl.
Note: tests should run the tree twice to make sure that the
constructed tree is re-usable.
"""
def testSucceed(self):
"""Can we succeed?
"""
s = owyl.succeed()
t = s()
self.assertEqual(next(t), True)
self.assertRaises(StopIteration, next(t))
t = s()
self.assertEqual(next(t), True)
self.assertRaises(StopIteration, next(t))
def testFail(self):
"""Can we fail?
"""
s = owyl.fail()
t = s()
self.assertEqual(next(t), False)
self.assertRaises(StopIteration, next(t))
t = s()
self.assertEqual(next(t), False)
self.assertRaises(StopIteration, next(t))
def testVisitSequenceSuccess(self):
"""Can we visit a successful sequence?
"""
tree = owyl.sequence(owyl.succeed(),
owyl.succeed(),
owyl.succeed())
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True, True, True, True])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True, True, True, True])
def testVisitSequenceFailure(self):
"""Can we visit a failing sequence?
"""
tree = owyl.sequence(owyl.succeed(),
owyl.succeed(),
owyl.fail(),
owyl.succeed())
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True, True, False, False])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True, True, False, False])
def testVisitSelectorSuccess(self):
"""Can we visit a successful selector?
"""
tree = owyl.selector(owyl.fail(),
owyl.fail(),
owyl.succeed(),
owyl.fail())
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False, False, True, True])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False, False, True, True])
def testVisitSelectorFailure(self):
"""Can we visit a failing selector?
"""
tree = owyl.selector(owyl.fail(),
owyl.fail(),
owyl.fail())
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False, False, False, False])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False, False, False, False])
def testParallel_AllSucceed_Success(self):
"""Can we visit a suceeding parallel (all succeed)?
"""
tree = owyl.parallel(owyl.sequence(owyl.succeed(),
owyl.succeed()),
owyl.sequence(owyl.succeed(),
owyl.succeed()),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ALL)
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True])
def testParallel_OneSucceeds_Success(self):
"""Can we visit a suceeding parallel (one succeeds)?
"""
tree = owyl.parallel(owyl.sequence(owyl.succeed(),
owyl.succeed()),
owyl.sequence(owyl.succeed(),
owyl.fail()),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ONE)
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [True])
def testParallel_AllSucceed_Failure(self):
"""Can we visit a failing parallel (all succeed)?
"""
tree = owyl.parallel(owyl.sequence(owyl.succeed(),
owyl.fail()),
owyl.sequence(owyl.succeed(),
owyl.succeed()),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ALL)
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False])
def testParallel_OneSucceeds_Failure(self):
"""Can we visit a failing parallel (one succeeds)?
"""
tree = owyl.parallel(owyl.sequence(owyl.fail(),
owyl.fail()),
owyl.sequence(owyl.fail(),
owyl.fail()),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ONE)
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False])
v = owyl.visit(tree)
results = [x for x in v if x is not None]
self.assertEqual(results, [False])
def testThrow(self):
"""Can we throw an exception within the tree?
"""
tree = owyl.sequence(owyl.succeed(),
owyl.succeed(),
owyl.throw(throws=ValueError,
throws_message="AUGH!!"),
)
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertRaises(ValueError, next(v))
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertRaises(ValueError, next(v))
def testCatch(self):
"""Can we catch an exception thrown within the tree?
"""
tree = owyl.sequence(owyl.succeed(),
owyl.succeed(),
owyl.catch(owyl.throw(throws=ValueError,
throws_message="AUGH!!"),
caught=ValueError,
branch=owyl.succeed())
)
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
def testCatchIgnoresOthers(self):
"""Does catch ignore other exceptions thrown within the tree?
"""
tree = owyl.sequence(owyl.succeed(),
owyl.succeed(),
owyl.catch(owyl.throw(throws=ValueError,
throws_message="AUGH!!"),
caught=IndexError,
branch=owyl.succeed())
)
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertRaises(ValueError, next(v))
v = owyl.visit(tree)
self.assertEqual(next(v), True)
self.assertEqual(next(v), True)
self.assertRaises(ValueError, next(v))
def testIdentity(self):
"""Does identity pass on return values unchanged?
"""
# Succeed after 5 iterations.
after = 5
tree = owyl.identity(owyl.succeedAfter(after=after))
v = owyl.visit(tree)
for x in range(after):
self.assertEqual(next(v), None)
self.assertEqual(next(v), True)
v = owyl.visit(tree)
for x in range(after):
self.assertEqual(next(v), None)
self.assertEqual(next(v), True)
tree = owyl.identity(owyl.failAfter(after=after))
v = owyl.visit(tree)
for x in range(after):
self.assertEqual(next(v), None)
self.assertEqual(next(v), False)
v = owyl.visit(tree)
for x in range(after):
self.assertEqual(next(v), None)
self.assertEqual(next(v), False)
def testCheckBB(self):
"""Can we check a value on a blackboard?
"""
value = "foo"
checker = lambda x: x == value
bb = blackboard.Blackboard('test', value=value)
tree = blackboard.checkBB(key='value',
check=checker)
# Note that we can pass in the blackboard at run-time.
v = owyl.visit(tree, blackboard=bb)
# Check should succeed.
self.assertEqual(next(v), True)
v = owyl.visit(tree, blackboard=bb)
self.assertEqual(next(v), True)
bb['value'] = 'bar'
# Check should now fail.
v = owyl.visit(tree, blackboard=bb)
self.assertEqual(next(v), False)
v = owyl.visit(tree, blackboard=bb)
self.assertEqual(next(v), False)
def testSetBB(self):
"""Can we set a value on a blackboard?
"""
value = 'foo'
checker = lambda x: x == value
bb = blackboard.Blackboard('test', value='bar')
tree = owyl.sequence(blackboard.setBB(key="value",
value=value),
blackboard.checkBB(key='value',
check=checker)
)
# Note that we can pass in the blackboard at run-time.
v = owyl.visit(tree, blackboard=bb)
# Sequence will succeed if the check succeeds.
result = [x for x in v][-1]
self.assertEqual(result, True)
v = owyl.visit(tree, blackboard=bb)
result = [x for x in v][-1]
self.assertEqual(result, True)
def testRepeatUntilSucceed(self):
"""Can we repeat a behavior until it succeeds?
"""
bb = blackboard.Blackboard('test', ) # 'value' defaults to None.
checker = lambda x: x is not None
parallel = owyl.parallel
repeat = owyl.repeatUntilSucceed
checkBB = blackboard.checkBB
setBB = blackboard.setBB
tree = parallel(repeat(checkBB(key='value',
check=checker),
final_value=True),
# That should fail until this sets the value:
owyl.selector(owyl.fail(),
owyl.fail(),
setBB(key='value',
value='foo')),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ALL)
v = owyl.visit(tree, blackboard=bb)
results = [x for x in v]
result = results[-1]
self.assertEqual(result, True)
# Need to reset the blackboard to get the same results.
bb = blackboard.Blackboard('test', ) # 'value' defaults to None.
v = owyl.visit(tree, blackboard=bb)
results = [x for x in v]
result = results[-1]
self.assertEqual(result, True)
def testRepeatUntilFail(self):
"""Can we repeat a behavior until it fails?
"""
bb = blackboard.Blackboard('test', value="foo")
checker = lambda x: x and True or False # must eval to True
parallel = owyl.parallel
repeat = owyl.repeatUntilFail
checkBB = blackboard.checkBB
setBB = blackboard.setBB
tree = parallel(repeat(checkBB(key='value',
check=checker),
final_value=True),
# That should succeed until this sets the value:
owyl.selector(owyl.fail(),
owyl.fail(),
setBB(key='value',
value=None)),
policy=owyl.PARALLEL_SUCCESS.REQUIRE_ALL)
v = owyl.visit(tree, blackboard=bb)
results = [x for x in v]
result = results[-1]
self.assertEqual(result, True)
# Need to reset the blackboard to get the same results.
bb = blackboard.Blackboard('test', value="foo")
v = owyl.visit(tree, blackboard=bb)
results = [x for x in v]
result = results[-1]
self.assertEqual(result, True)
if __name__ == "__main__":
runner = unittest
try:
import testoob
runner = testoob
except ImportError:
pass
runner.main()
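The visit/sequence yield patterns the tests above assert can be reproduced with a toy generator-based tree. This is a sketch consistent with the expected outputs, not owyl's implementation (which also supports blackboards and parallel policies):

```python
def succeed():
    def task():
        yield True
    return task

def fail():
    def task():
        yield False
    return task

def sequence(*children):
    def task():
        result = True
        for child in children:
            for result in child():   # re-yield each child's results
                yield result
            if result is False:      # a failing child ends the sequence
                break
        yield result                 # final value of the sequence itself
    return task

def visit(tree):
    return tree()

tree = sequence(succeed(), succeed(), fail(), succeed())
print([x for x in visit(tree) if x is not None])
# [True, True, False, False]
```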
| 32.918317 | 76 | 0.504324 | 1,425 | 13,299 | 4.675088 | 0.119298 | 0.11483 | 0.052537 | 0.073551 | 0.811168 | 0.80036 | 0.770339 | 0.760733 | 0.724557 | 0.683428 | 0 | 0.002335 | 0.388149 | 13,299 | 403 | 77 | 33 | 0.816394 | 0.121062 | 0 | 0.763359 | 0 | 0 | 0.011247 | 0 | 0 | 0 | 0 | 0 | 0.225191 | 1 | 0.068702 | false | 0.003817 | 0.019084 | 0 | 0.091603 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
738362558d730c020c556528e1034c8cb4754c79 | 2,217 | py | Python | tests/deconflict_test.py | NACHC-CAD/linkage-agent-tools | 324299e534bc55bd652eb670feb195ce5646f13e | [
"Apache-2.0"
] | null | null | null | tests/deconflict_test.py | NACHC-CAD/linkage-agent-tools | 324299e534bc55bd652eb670feb195ce5646f13e | [
"Apache-2.0"
] | 1 | 2021-10-01T15:13:15.000Z | 2021-10-01T15:13:15.000Z | tests/deconflict_test.py | NACHC-CAD/linkage-agent-tools | 324299e534bc55bd652eb670feb195ce5646f13e | [
"Apache-2.0"
] | null | null | null | from dcctools.deconflict import deconflict, link_count
example_result = {
"a": [142],
"b": [142, 280],
"run_results": [
{"a": 142, "b": 142, "project": "name-sex-dob-zip"},
{"a": 142, "b": 280, "project": "name-sex-dob-zip"},
{"a": 142, "b": 142, "project": "name-sex-dob-phone"},
{"a": 142, "b": 280, "project": "name-sex-dob-phone"},
{"a": 142, "b": 142, "project": "name-sex-dob-addr"},
{"a": 142, "b": 280, "project": "name-sex-dob-addr"},
{"a": 142, "b": 142, "project": "name-sex-dob-parents"},
{"a": 142, "b": 280, "project": "name-sex-dob-parents"},
{"a": 142, "c": 142, "project": "name-sex-dob-zip"},
{"a": 142, "c": 142, "project": "name-sex-dob-phone"},
{"a": 142, "c": 142, "project": "name-sex-dob-addr"},
{"b": 142, "c": 142, "project": "name-sex-dob-zip"},
{"b": 142, "c": 142, "project": "name-sex-dob-phone"},
{"b": 142, "c": 142, "project": "name-sex-dob-addr"},
{"b": 142, "c": 142, "project": "name-sex-dob-parents"},
],
"c": [142],
}
example_result_no_c = {
"a": [142],
"b": [142, 280],
"run_results": [
{"a": 142, "b": 142, "project": "name-sex-dob-zip"},
{"a": 142, "b": 280, "project": "name-sex-dob-zip"},
{"a": 142, "b": 142, "project": "name-sex-dob-phone"},
{"a": 142, "b": 280, "project": "name-sex-dob-phone"},
{"a": 142, "b": 142, "project": "name-sex-dob-addr"},
{"a": 142, "b": 280, "project": "name-sex-dob-addr"},
{"a": 142, "b": 142, "project": "name-sex-dob-parents"},
{"a": 142, "b": 280, "project": "name-sex-dob-parents"},
],
}
def test_link_count():
count = link_count(example_result, "b", 142)
assert count == 8
def test_deconflict():
deconflicted_record = deconflict(example_result, ["a", "b", "c"])
assert deconflicted_record["a"] == 142
assert deconflicted_record["b"] == 142
assert deconflicted_record["c"] == 142
def test_deconflict_with_missing_system():
deconflicted_record = deconflict(example_result_no_c, ["a", "b", "c"])
assert deconflicted_record["a"] == 142
assert deconflicted_record["b"] == 142
| 38.224138 | 74 | 0.528642 | 302 | 2,217 | 3.791391 | 0.109272 | 0.080349 | 0.281223 | 0.341485 | 0.823581 | 0.735371 | 0.729258 | 0.729258 | 0.648035 | 0.648035 | 0 | 0.102476 | 0.216509 | 2,217 | 57 | 75 | 38.894737 | 0.556707 | 0 | 0 | 0.571429 | 0 | 0 | 0.294091 | 0 | 0 | 0 | 0 | 0 | 0.122449 | 1 | 0.061224 | false | 0 | 0.020408 | 0 | 0.081633 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
738e54d588ad0a67d06a9f242314f67878f2bba6 | 101 | py | Python | pybase24/__init__.py | mildmelon/python-base24 | f730eddc34668d17b09d99495c63b096ad22c748 | [
"MIT"
] | 1 | 2020-03-09T04:35:00.000Z | 2020-03-09T04:35:00.000Z | pybase24/__init__.py | mildmelon/python-base24 | f730eddc34668d17b09d99495c63b096ad22c748 | [
"MIT"
] | null | null | null | pybase24/__init__.py | mildmelon/python-base24 | f730eddc34668d17b09d99495c63b096ad22c748 | [
"MIT"
] | null | null | null | from pybase24.base24 import ALPHABET, ALPHABET_LENGTH
from pybase24.base24 import encode24, decode24
| 33.666667 | 53 | 0.861386 | 13 | 101 | 6.615385 | 0.615385 | 0.27907 | 0.418605 | 0.55814 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131868 | 0.09901 | 101 | 2 | 54 | 50.5 | 0.813187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
73affc7b86cf58700e0444c9daaa12b6a841ead2 | 2,641 | py | Python | saleor/core/tests/test_middleware.py | fairhopeweb/saleor | 9ac6c22652d46ba65a5b894da5f1ba5bec48c019 | [
"CC-BY-4.0"
] | 15,337 | 2015-01-12T02:11:52.000Z | 2021-10-05T19:19:29.000Z | saleor/core/tests/test_middleware.py | fairhopeweb/saleor | 9ac6c22652d46ba65a5b894da5f1ba5bec48c019 | [
"CC-BY-4.0"
] | 7,486 | 2015-02-11T10:52:13.000Z | 2021-10-06T09:37:15.000Z | saleor/core/tests/test_middleware.py | fairhopeweb/saleor | 9ac6c22652d46ba65a5b894da5f1ba5bec48c019 | [
"CC-BY-4.0"
] | 5,864 | 2015-01-16T14:52:54.000Z | 2021-10-05T23:01:15.000Z | from django.core.handlers.base import BaseHandler
from freezegun import freeze_time
from ..jwt import (
JWT_REFRESH_TOKEN_COOKIE_NAME,
JWT_REFRESH_TYPE,
create_refresh_token,
jwt_encode,
jwt_user_payload,
)
@freeze_time("2020-03-18 12:00:00")
def test_jwt_refresh_token_middleware(rf, customer_user, settings):
refresh_token = create_refresh_token(customer_user)
settings.MIDDLEWARE = [
"saleor.core.middleware.jwt_refresh_token_middleware",
]
request = rf.request()
request.refresh_token = refresh_token
handler = BaseHandler()
handler.load_middleware()
response = handler.get_response(request)
cookie = response.cookies.get(JWT_REFRESH_TOKEN_COOKIE_NAME)
assert cookie.value == refresh_token
@freeze_time("2020-03-18 12:00:00")
def test_jwt_refresh_token_middleware_token_without_expire(rf, customer_user, settings):
settings.JWT_EXPIRE = True
payload = jwt_user_payload(
customer_user,
JWT_REFRESH_TYPE,
settings.JWT_TTL_REFRESH,
)
del payload["exp"]
refresh_token = jwt_encode(payload)
settings.MIDDLEWARE = [
"saleor.core.middleware.jwt_refresh_token_middleware",
]
request = rf.request()
request.refresh_token = refresh_token
handler = BaseHandler()
handler.load_middleware()
response = handler.get_response(request)
cookie = response.cookies.get(JWT_REFRESH_TOKEN_COOKIE_NAME)
assert cookie.value == refresh_token
@freeze_time("2020-03-18 12:00:00")
def test_jwt_refresh_token_middleware_samesite_debug_mode(rf, customer_user, settings):
refresh_token = create_refresh_token(customer_user)
settings.MIDDLEWARE = [
"saleor.core.middleware.jwt_refresh_token_middleware",
]
settings.DEBUG = True
request = rf.request()
request.refresh_token = refresh_token
handler = BaseHandler()
handler.load_middleware()
response = handler.get_response(request)
cookie = response.cookies.get(JWT_REFRESH_TOKEN_COOKIE_NAME)
assert cookie["samesite"] == "Lax"
@freeze_time("2020-03-18 12:00:00")
def test_jwt_refresh_token_middleware_samesite_none(rf, customer_user, settings):
refresh_token = create_refresh_token(customer_user)
settings.MIDDLEWARE = [
"saleor.core.middleware.jwt_refresh_token_middleware",
]
settings.DEBUG = False
request = rf.request()
request.refresh_token = refresh_token
handler = BaseHandler()
handler.load_middleware()
response = handler.get_response(request)
cookie = response.cookies.get(JWT_REFRESH_TOKEN_COOKIE_NAME)
assert cookie["samesite"] == "None"
| 32.604938 | 88 | 0.740629 | 325 | 2,641 | 5.673846 | 0.16 | 0.201735 | 0.105748 | 0.10846 | 0.816161 | 0.802603 | 0.802603 | 0.802603 | 0.802603 | 0.802603 | 0 | 0.025466 | 0.167361 | 2,641 | 80 | 89 | 33.0125 | 0.813097 | 0 | 0 | 0.614286 | 0 | 0 | 0.115865 | 0.077243 | 0 | 0 | 0 | 0 | 0.057143 | 1 | 0.057143 | false | 0 | 0.042857 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73eb1bceac18001f8491ab9f5c52a91da1e2874a | 38,132 | py | Python | msgraph-cli-extensions/beta/financials_beta/azext_financials_beta/generated/action.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | msgraph-cli-extensions/beta/financials_beta/azext_financials_beta/generated/action.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | msgraph-cli-extensions/beta/financials_beta/azext_financials_beta/generated/action.py | thewahome/msgraph-cli | 33127d9efa23a0e5f5303c93242fbdbb73348671 | [
"MIT"
] | null | null | null | # --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=protected-access
import argparse
from collections import defaultdict
from knack.util import CLIError
class AddAccounts(argparse._AppendAction):
def __call__(self, parser, namespace, values, option_string=None):
action = self.get_action(values, option_string)
super(AddAccounts, self).__call__(parser, namespace, action, option_string)
def get_action(self, values, option_string): # pylint: disable=no-self-use
try:
properties = defaultdict(list)
for (k, v) in (x.split('=', 1) for x in values):
properties[k].append(v)
properties = dict(properties)
except ValueError:
raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
d = {}
for k in properties:
kl = k.lower()
v = properties[k]
if kl == 'blocked':
d['blocked'] = v[0]
elif kl == 'category':
d['category'] = v[0]
elif kl == 'display-name':
d['display_name'] = v[0]
elif kl == 'last-modified-date-time':
d['last_modified_date_time'] = v[0]
elif kl == 'number':
d['number'] = v[0]
elif kl == 'sub-category':
d['sub_category'] = v[0]
elif kl == 'id':
d['id'] = v[0]
else:
raise CLIError('Unsupported Key {} is provided for parameter accounts. All possible keys are: blocked, '
'category, display-name, last-modified-date-time, number, sub-category, id'.format(k))
return d
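Every generated action class above repeats the same KEY=VALUE parsing step before mapping keys. A stripped-down, generic version of that pattern (the subclassing of the protected `argparse._AppendAction` mirrors the generated code):

```python
import argparse
from collections import defaultdict

class KeyValueAppend(argparse._AppendAction):
    """Append a dict parsed from space-separated KEY=VALUE tokens."""
    def __call__(self, parser, namespace, values, option_string=None):
        props = defaultdict(list)
        for k, v in (x.split('=', 1) for x in values):
            props[k].append(v)
        # keep only the first value per key, as the generated actions do
        d = {k: v[0] for k, v in props.items()}
        super(KeyValueAppend, self).__call__(parser, namespace, d, option_string)

parser = argparse.ArgumentParser()
parser.add_argument('--accounts', nargs='+', action=KeyValueAppend)
ns = parser.parse_args(['--accounts', 'number=10', 'category=Assets'])
print(ns.accounts)  # [{'number': '10', 'category': 'Assets'}]
```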


class AddAgedAccountsPayable(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddAgedAccountsPayable, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'aged-as-of-date':
                d['aged_as_of_date'] = v[0]
            elif kl == 'balance-due':
                d['balance_due'] = v[0]
            elif kl == 'currency-code':
                d['currency_code'] = v[0]
            elif kl == 'current-amount':
                d['current_amount'] = v[0]
            elif kl == 'name':
                d['name'] = v[0]
            elif kl == 'period1-amount':
                d['period1_amount'] = v[0]
            elif kl == 'period2-amount':
                d['period2_amount'] = v[0]
            elif kl == 'period3-amount':
                d['period3_amount'] = v[0]
            elif kl == 'period-length-filter':
                d['period_length_filter'] = v[0]
            elif kl == 'vendor-number':
                d['vendor_number'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter aged_accounts_payable. All possible keys '
                               'are: aged-as-of-date, balance-due, currency-code, current-amount, name, '
                               'period1-amount, period2-amount, period3-amount, period-length-filter, vendor-number, '
                               'id'.format(k))
        return d


class AddAgedAccountsReceivable(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddAgedAccountsReceivable, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'aged-as-of-date':
                d['aged_as_of_date'] = v[0]
            elif kl == 'balance-due':
                d['balance_due'] = v[0]
            elif kl == 'currency-code':
                d['currency_code'] = v[0]
            elif kl == 'current-amount':
                d['current_amount'] = v[0]
            elif kl == 'customer-number':
                d['customer_number'] = v[0]
            elif kl == 'name':
                d['name'] = v[0]
            elif kl == 'period1-amount':
                d['period1_amount'] = v[0]
            elif kl == 'period2-amount':
                d['period2_amount'] = v[0]
            elif kl == 'period3-amount':
                d['period3_amount'] = v[0]
            elif kl == 'period-length-filter':
                d['period_length_filter'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter aged_accounts_receivable. All possible '
                               'keys are: aged-as-of-date, balance-due, currency-code, current-amount, '
                               'customer-number, name, period1-amount, period2-amount, period3-amount, '
                               'period-length-filter, id'.format(k))
        return d


class AddCountriesRegions(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddCountriesRegions, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'address-format':
                d['address_format'] = v[0]
            elif kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter countries_regions. All possible keys are: '
                               'address-format, code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddCurrencies(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddCurrencies, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'amount-decimal-places':
                d['amount_decimal_places'] = v[0]
            elif kl == 'amount-rounding-precision':
                d['amount_rounding_precision'] = v[0]
            elif kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'symbol':
                d['symbol'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter currencies. All possible keys are: '
                               'amount-decimal-places, amount-rounding-precision, code, display-name, '
                               'last-modified-date-time, symbol, id'.format(k))
        return d


class AddFinancialsDimensionValues(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsDimensionValues, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter dimension_values. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddItemCategories(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddItemCategories, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter item_categories. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddPaymentMethods(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddPaymentMethods, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter payment_methods. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddPaymentTerms(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddPaymentTerms, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'calculate-discount-on-credit-memos':
                d['calculate_discount_on_credit_memos'] = v[0]
            elif kl == 'code':
                d['code'] = v[0]
            elif kl == 'discount-date-calculation':
                d['discount_date_calculation'] = v[0]
            elif kl == 'discount-percent':
                d['discount_percent'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'due-date-calculation':
                d['due_date_calculation'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter payment_terms. All possible keys are: '
                               'calculate-discount-on-credit-memos, code, discount-date-calculation, discount-percent, '
                               'display-name, due-date-calculation, last-modified-date-time, id'.format(k))
        return d


class AddFinancialsPicture(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsPicture, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'content':
                d['content'] = v[0]
            elif kl == 'content-type':
                d['content_type'] = v[0]
            elif kl == 'height':
                d['height'] = v[0]
            elif kl == 'width':
                d['width'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter picture. All possible keys are: content, '
                               'content-type, height, width, id'.format(k))
        return d


class AddShipmentMethods(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddShipmentMethods, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter shipment_methods. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddTaxAreas(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddTaxAreas, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'tax-type':
                d['tax_type'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter tax_areas. All possible keys are: code, '
                               'display-name, last-modified-date-time, tax-type, id'.format(k))
        return d


class AddTaxGroups(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddTaxGroups, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'tax-type':
                d['tax_type'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter tax_groups. All possible keys are: code, '
                               'display-name, last-modified-date-time, tax-type, id'.format(k))
        return d


class AddUnitsOfMeasure(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddUnitsOfMeasure, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'international-standard-code':
                d['international_standard_code'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter units_of_measure. All possible keys are: '
                               'code, display-name, international-standard-code, last-modified-date-time, id'.format(k))
        return d


class AddAccount(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.body = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
        return d


class AddAddress(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.address = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'city':
                d['city'] = v[0]
            elif kl == 'country-letter-code':
                d['country_letter_code'] = v[0]
            elif kl == 'postal-code':
                d['postal_code'] = v[0]
            elif kl == 'state':
                d['state'] = v[0]
            elif kl == 'street':
                d['street'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter address. All possible keys are: city, '
                               'country-letter-code, postal-code, state, street'.format(k))
        return d


class AddCurrency(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.body = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
        return d


class AddPaymentMethod(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.payment_method = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter payment_method. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddPaymentTerm(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.payment_term = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'calculate-discount-on-credit-memos':
                d['calculate_discount_on_credit_memos'] = v[0]
            elif kl == 'code':
                d['code'] = v[0]
            elif kl == 'discount-date-calculation':
                d['discount_date_calculation'] = v[0]
            elif kl == 'discount-percent':
                d['discount_percent'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'due-date-calculation':
                d['due_date_calculation'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter payment_term. All possible keys are: '
                               'calculate-discount-on-credit-memos, code, discount-date-calculation, discount-percent, '
                               'display-name, due-date-calculation, last-modified-date-time, id'.format(k))
        return d


class AddFinancialsCompaniesPicture(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsCompaniesPicture, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'content':
                d['content'] = v[0]
            elif kl == 'content-type':
                d['content_type'] = v[0]
            elif kl == 'height':
                d['height'] = v[0]
            elif kl == 'width':
                d['width'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter picture. All possible keys are: content, '
                               'content-type, height, width, id'.format(k))
        return d


class AddShipmentMethod(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.shipment_method = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter shipment_method. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddFinancialsCompaniesDimensionValues(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsCompaniesDimensionValues, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter dimension_values. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddFinancialsFinancialCompanyCreateEmployeePicture(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsFinancialCompanyCreateEmployeePicture, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'content':
                d['content'] = v[0]
            elif kl == 'content-type':
                d['content_type'] = v[0]
            elif kl == 'height':
                d['height'] = v[0]
            elif kl == 'width':
                d['width'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter picture. All possible keys are: content, '
                               'content-type, height, width, id'.format(k))
        return d


class AddItemCategory(argparse.Action):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        namespace.item_category = action

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'code':
                d['code'] = v[0]
            elif kl == 'display-name':
                d['display_name'] = v[0]
            elif kl == 'last-modified-date-time':
                d['last_modified_date_time'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter item_category. All possible keys are: '
                               'code, display-name, last-modified-date-time, id'.format(k))
        return d


class AddFinancialsFinancialCompanyCreateSaleCreditMemoLinePicture(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsFinancialCompanyCreateSaleCreditMemoLinePicture, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'content':
                d['content'] = v[0]
            elif kl == 'content-type':
                d['content_type'] = v[0]
            elif kl == 'height':
                d['height'] = v[0]
            elif kl == 'width':
                d['width'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter picture. All possible keys are: content, '
                               'content-type, height, width, id'.format(k))
        return d


class AddFinancialsFinancialCompanyCreatePurchaseInvoicePicture(argparse._AppendAction):
    def __call__(self, parser, namespace, values, option_string=None):
        action = self.get_action(values, option_string)
        super(AddFinancialsFinancialCompanyCreatePurchaseInvoicePicture, self).__call__(parser, namespace, action, option_string)

    def get_action(self, values, option_string):  # pylint: disable=no-self-use
        try:
            properties = defaultdict(list)
            for (k, v) in (x.split('=', 1) for x in values):
                properties[k].append(v)
            properties = dict(properties)
        except ValueError:
            raise CLIError('usage error: {} [KEY=VALUE ...]'.format(option_string))
        d = {}
        for k in properties:
            kl = k.lower()
            v = properties[k]
            if kl == 'content':
                d['content'] = v[0]
            elif kl == 'content-type':
                d['content_type'] = v[0]
            elif kl == 'height':
                d['height'] = v[0]
            elif kl == 'width':
                d['width'] = v[0]
            elif kl == 'id':
                d['id'] = v[0]
            else:
                raise CLIError('Unsupported Key {} is provided for parameter picture. All possible keys are: content, '
                               'content-type, height, width, id'.format(k))
        return d
| 43.23356 | 133 | 0.521163 | 4,130 | 38,132 | 4.685714 | 0.047458 | 0.013849 | 0.034105 | 0.045473 | 0.888177 | 0.883578 | 0.877635 | 0.874122 | 0.874122 | 0.874122 | 0 | 0.007228 | 0.354164 | 38,132 | 881 | 134 | 43.282633 | 0.778576 | 0.03147 | 0 | 0.856061 | 0 | 0.008838 | 0.205781 | 0.050841 | 0 | 0 | 0 | 0 | 0 | 1 | 0.065657 | false | 0 | 0.003788 | 0 | 0.135101 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
73eba2a0057ee47e18b3a87462eecacf636329a2 | 8,364 | py | Python | sawyer/mujoco/tasks/toy_tasks.py | rlagywjd802/gym-sawyer | 385bbeafcccb61afb9099554f6a99b16f1f1a7c5 | ["MIT"] | null | null | null | sawyer/mujoco/tasks/toy_tasks.py | rlagywjd802/gym-sawyer | 385bbeafcccb61afb9099554f6a99b16f1f1a7c5 | ["MIT"] | null | null | null | sawyer/mujoco/tasks/toy_tasks.py | rlagywjd802/gym-sawyer | 385bbeafcccb61afb9099554f6a99b16f1f1a7c5 | ["MIT"] | null | null | null |
import numpy as np
from sawyer.mujoco.tasks.base import ComposableTask

class InsertTask(ComposableTask):
    """
    Task to insert a key object into an upward facing lock.

    The task assumes the key is already grasped and the gripper is close to the
    lock hole.

    Reward function is based on the following heuristics:
    - Positive reward for a smaller z coordinate of the key
    - Negative reward for releasing object
    """
    def __init__(self,
                 key_object,
                 lock_object,
                 never_done=False,
                 success_thresh=0.01,
                 target_z_pos=0.20,
                 completion_bonus=0,
                 c_dist=0.1,
                 c_grasp=0.9):
        self._key_object = key_object
        self._lock_object = lock_object
        self._never_done = never_done
        self._success_thresh = success_thresh
        self._target_z_pos = target_z_pos
        self._completion_bonus = completion_bonus
        self._c_dist = c_dist
        self._c_grasp = c_grasp

    def compute_reward(self, obs, info):
        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        target_pos = np.array([lock_site[0], lock_site[1], self._target_z_pos])
        r_dist = -np.linalg.norm(target_pos - key_pos)
        r_grasp = grasped * self._c_grasp
        return r_dist + r_grasp

    def is_success(self, obs, info):
        if self._never_done:
            return False

        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        target_pos = np.array([lock_site[0], lock_site[1], self._target_z_pos])
        return (grasped and
                np.linalg.norm(target_pos - key_pos) < self._success_thresh)

    @property
    def completion_bonus(self):
        return self._completion_bonus
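The dense term of the reward above is just the negative Euclidean distance from the key to a point above the hole, plus a constant bonus while the object stays grasped. A stand-alone sketch of that heuristic (the `insert_reward` helper and the sample positions are illustrative, not part of the sawyer package; it uses plain tuples instead of the numpy arrays the task uses):

```python
import math


def insert_reward(key_pos, lock_site, grasped, target_z=0.20, c_grasp=0.9):
    """Stand-alone reproduction of InsertTask.compute_reward's heuristic."""
    target = (lock_site[0], lock_site[1], target_z)
    r_dist = -math.dist(target, key_pos)       # closer to the hole -> higher reward
    return r_dist + grasped * c_grasp          # keep holding the key -> +c_grasp


# The reward grows as the grasped key approaches the hole at the target height.
far = insert_reward((0.5, 0.5, 0.35), (0.4, 0.4, 0.2), grasped=True)
near = insert_reward((0.41, 0.4, 0.21), (0.4, 0.4, 0.2), grasped=True)
print(near > far)  # True
```

Because `r_dist` is unbounded below while `r_grasp` is a fixed bonus, the agent is never rewarded for dropping the key to shortcut the distance term.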


class RemoveTask(ComposableTask):
    """
    Task to remove a key object from an upward facing lock.

    The task assumes the key is already grasped and the gripper is close to the
    lock hole.

    Reward function is based on the following heuristics:
    - Positive reward for a larger z coordinate of the key
    - Negative reward for releasing object
    """
    def __init__(self,
                 key_object,
                 lock_object,
                 never_done=False,
                 success_thresh=0.01,
                 target_z_pos=0.35,
                 completion_bonus=0,
                 c_dist=0.1,
                 c_grasp=0.9):
        self._key_object = key_object
        self._lock_object = lock_object
        self._never_done = never_done
        self._success_thresh = success_thresh
        self._target_z_pos = target_z_pos
        self._completion_bonus = completion_bonus
        self._c_dist = c_dist
        self._c_grasp = c_grasp

    def compute_reward(self, obs, info):
        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        target_pos = np.array([lock_site[0], lock_site[1], self._target_z_pos])
        r_dist = -np.linalg.norm(target_pos - key_pos)
        r_grasp = grasped * self._c_grasp
        return r_dist + r_grasp

    def is_success(self, obs, info):
        if self._never_done:
            return False

        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        target_pos = np.array([lock_site[0], lock_site[1], self._target_z_pos])
        return (grasped and
                np.linalg.norm(target_pos - key_pos) < self._success_thresh)

    @property
    def completion_bonus(self):
        return self._completion_bonus


class OpenTask(ComposableTask):
    """
    Task to open a toy box lid on a lateral sliding joint with an inserted peg.

    The task assumes there is already a key object inserted into the lid hole.

    Reward function is based on the following heuristics:
    - Positive reward for increased lateral distance between box and lid
    - Negative reward for releasing key object
    """
    def __init__(self,
                 lid_object,
                 key_object,
                 never_done=False,
                 success_thresh=0.01,
                 target_lid_jpos=-0.05,
                 completion_bonus=0,
                 c_jdist=0.2,
                 c_xydist=0.8):
        self._lid_object = lid_object
        self._key_object = key_object
        self._never_done = never_done
        self._success_thresh = success_thresh
        self._target_lid_jpos = target_lid_jpos
        self._completion_bonus = completion_bonus
        self._c_jdist = c_jdist
        self._c_xydist = c_xydist

    def compute_reward(self, obs, info):
        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lid_joint_state = info['lid_joint_state']
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        dxy_peg2hole = key_pos[:2] - lock_site[:2]
        r_jdist = (1 - np.tanh(10. * np.abs(lid_joint_state - self._target_lid_jpos))) * self._c_jdist
        r_peg2hole = (1 - np.tanh(np.linalg.norm(dxy_peg2hole))) * self._c_xydist
        return int(grasped) * (r_jdist + r_peg2hole)

    def is_success(self, obs, info):
        if self._never_done:
            return False

        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lid_joint_state = info['lid_joint_state']
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]

        dxy_peg2hole = key_pos[:2] - lock_site[:2]
        return (grasped and
                np.linalg.norm(dxy_peg2hole) < self._success_thresh and
                lid_joint_state <= self._target_lid_jpos)

    @property
    def completion_bonus(self):
        return self._completion_bonus
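Unlike the unbounded distance term in the insert/remove tasks, the open/close rewards use a tanh-saturated shaping term, `(1 - tanh(scale * |error|)) * weight`, which peaks at `weight` when the error is zero and decays smoothly toward zero. A stand-alone sketch (the `shaped` helper and its sample inputs are illustrative, not part of the sawyer package):

```python
import math


def shaped(error, scale=10., weight=0.2):
    """Bounded shaping term as used for the lid-joint distance:
    equals `weight` at the target and decays toward 0 as |error| grows."""
    return (1 - math.tanh(scale * abs(error))) * weight


print(shaped(0.0))  # 0.2
```

The `scale` factor controls how sharply the reward concentrates around the target: with `scale=10.`, an error of 0.5 m already drives the term to nearly zero, so the gradient signal is strongest near the goal while the term stays bounded everywhere.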
class CloseTask(ComposableTask):
    """
    Task to close a toy box lid with an inserted peg.

    The task assumes there is already a key object inserted into the lid hole.
    Reward function is based on the following heuristics:
    - Positive reward for decreased lateral distance between box and lid
    - Negative reward for releasing key object
    """
    def __init__(self,
                 lid_object,
                 key_object,
                 never_done=False,
                 success_thresh=0.01,
                 target_lid_jpos=-0.05,
                 completion_bonus=0,
                 c_jdist=0.2,
                 c_xydist=0.8):
        self._lid_object = lid_object
        self._key_object = key_object
        self._never_done = never_done
        self._success_thresh = success_thresh
        self._target_lid_jpos = target_lid_jpos
        self._completion_bonus = completion_bonus
        self._c_jdist = c_jdist
        self._c_xydist = c_xydist

    def compute_reward(self, obs, info):
        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lid_joint_state = info['lid_joint_state']
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]
        dxy_peg2hole = key_pos[:2] - lock_site[:2]
        r_jdist = (1 - np.tanh(10. * np.abs(lid_joint_state - self._target_lid_jpos))) * self._c_jdist
        r_peg2hole = (1 - np.tanh(np.linalg.norm(dxy_peg2hole))) * self._c_xydist
        return int(grasped) * (r_jdist + r_peg2hole)

    def is_success(self, obs, info):
        if self._never_done:
            return False
        key_pos = info['world_obs']['{}_position'.format(self._key_object)]
        lid_joint_state = info['lid_joint_state']
        lock_site = info['hole_site']
        grasped = info['grasped_{}'.format(self._key_object)]
        dxy_peg2hole = key_pos[:2] - lock_site[:2]
        return (grasped and
                np.linalg.norm(dxy_peg2hole) < self._success_thresh and
                lid_joint_state >= self._target_lid_jpos)

    @property
    def completion_bonus(self):
        return self._completion_bonus
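The tanh-based shaping used by `compute_reward` above can be exercised on its own. A minimal standalone sketch (function name and the grasp gating are illustrative, mirroring the coefficients and `info` fields used above):

```python
import numpy as np

def shaped_reward(key_xy, hole_xy, lid_jpos, target_jpos=-0.05,
                  c_jdist=0.2, c_xydist=0.8, grasped=True):
    # Joint term: saturates quickly (tanh of 10x the joint-position error).
    r_jdist = (1 - np.tanh(10.0 * abs(lid_jpos - target_jpos))) * c_jdist
    # Lateral term: decays with planar peg-to-hole distance.
    dxy = np.asarray(key_xy, dtype=float) - np.asarray(hole_xy, dtype=float)
    r_xy = (1 - np.tanh(np.linalg.norm(dxy))) * c_xydist
    # The whole reward is gated on grasping, as in compute_reward.
    return int(grasped) * (r_jdist + r_xy)

# Perfect alignment while grasping yields the full c_jdist + c_xydist = 1.0.
print(shaped_reward([0.3, 0.3], [0.3, 0.3], -0.05))  # 1.0
```

Releasing the object zeroes the reward regardless of alignment, which is the "negative reward for releasing" heuristic in the docstring.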
# test/test_distributed_metrics.py (repo: intel/lp-opt-tool, license: Apache-2.0)
"""Tests for the distributed metrics."""
import os
import signal
import shutil
import subprocess
import unittest
import re
import tensorflow
def build_fake_ut():
    fake_ut = """
import numpy as np
import unittest
from neural_compressor.metric import METRICS
from neural_compressor.experimental.metric.f1 import evaluate
from neural_compressor.experimental.metric.evaluate_squad import evaluate as evaluate_squad
from neural_compressor.experimental.metric import bleu
import horovod.tensorflow as hvd
import os
import json
import tensorflow as tf
tf.compat.v1.enable_eager_execution()
class TestMetrics(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        hvd.init()
        if hvd.rank() == 0:
            if os.path.exists('anno_0.yaml'):
                os.remove('anno_0.yaml')
            if os.path.exists('anno_1.yaml'):
                os.remove('anno_1.yaml')
            if os.path.exists('anno_2.yaml'):
                os.remove('anno_2.yaml')
        while hvd.rank() == 1:
            if not os.path.exists('anno_0.yaml') \\
                    and not os.path.exists('anno_1.yaml') \\
                    and not os.path.exists('anno_2.yaml'):
                break

    @classmethod
    def tearDownClass(cls):
        if hvd.rank() == 1:
            if os.path.exists('anno_0.yaml'):
                os.remove('anno_0.yaml')
            if os.path.exists('anno_1.yaml'):
                os.remove('anno_1.yaml')
            if os.path.exists('anno_2.yaml'):
                os.remove('anno_2.yaml')
    def test_mIOU(self):
        metrics = METRICS('tensorflow')
        miou = metrics['mIOU']()
        miou.hvd = hvd
        if hvd.rank() == 0:
            preds = np.array([0])
            labels = np.array([0])
        else:
            preds = np.array([0, 1, 1])
            labels = np.array([1, 0, 1])
        miou.update(preds, labels)
        self.assertAlmostEqual(miou.result(), 0.33333334)
        miou.reset()
        if hvd.rank() == 0:
            preds = np.array([0, 0])
            labels = np.array([0, 1])
        else:
            preds = np.array([1, 1])
            labels = np.array([1, 1])
        miou.update(preds, labels)
        self.assertAlmostEqual(miou.result(), 0.58333333)
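The expected values above can be reproduced single-process by pooling both ranks' data. A minimal confusion-matrix mIoU sketch (the helper name is illustrative, not the library API):

```python
import numpy as np

def mean_iou(preds, labels, num_classes):
    # Confusion matrix: rows = true class, cols = predicted class.
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    for p, t in zip(preds, labels):
        cm[t, p] += 1
    tp = np.diag(cm)
    # union_c = predicted-as-c + labeled-c - TP_c
    union = cm.sum(axis=0) + cm.sum(axis=1) - tp
    iou = tp / np.maximum(union, 1)
    # Average IoU over classes that actually appear.
    return iou[union > 0].mean()

# Pooling rank 0 (preds [0], labels [0]) and rank 1 (preds [0,1,1], labels [1,0,1]):
print(mean_iou([0, 0, 1, 1], [0, 1, 0, 1], 2))  # ~0.33333333
```

The second assertion follows the same way: pooling gives preds [0, 0, 1, 1] and labels [0, 1, 1, 1], whose per-class IoUs are 1/2 and 2/3, averaging to 7/12 ≈ 0.58333333.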
def test_onnxrt_GLUE(self):
metrics = METRICS('onnxrt_qlinearops')
glue = metrics['GLUE']('mrpc')
glue.hvd = hvd
hvd.init()
preds = [np.array(
[[-3.2443411, 3.0909934],
[2.0500996, -2.3100944],
[1.870293 , -2.0741048],
[-2.8377204, 2.617834],
[2.008347 , -2.0215416],
[-2.9693947, 2.7782154],
[-2.9949608, 2.7887983],
[-3.0623112, 2.8748074]])
]
labels = [np.array([1, 0, 0, 1, 0, 1, 0, 1])]
self.assertRaises(NotImplementedError, glue.update, preds, labels)
preds_2 = [np.array(
[[-3.1296735, 2.8356276],
[-3.172515 , 2.9173899],
[-3.220131 , 3.0916846],
[2.1452675, -1.9398905],
[1.5475761, -1.9101546],
[-2.9797182, 2.721741],
[-3.2052834, 2.9934788],
[-2.7451005, 2.622343]])
]
labels_2 = [np.array([1, 1, 1, 0, 0, 1, 1, 1])]
self.assertRaises(NotImplementedError, glue.update, preds_2, labels_2)
glue.reset()
self.assertRaises(NotImplementedError, glue.update, preds, labels)
    def test_tensorflow_F1(self):
        metrics = METRICS('tensorflow')
        F1 = metrics['F1']()
        F1.hvd = hvd
        hvd.init()
        if hvd.rank() == 0:
            preds = [1, 1, 1, 1]
            labels = [0, 1, 1, 1]
        else:
            preds = [1, 1, 1, 1, 1, 1]
            labels = [1, 1, 1, 1, 1, 1]
        F1.update(preds, labels)
        self.assertEqual(F1.result(), 0.9)
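For comparison, a plain single-process binary F1 from TP/FP/FN counts; the distributed metric above additionally reduces its counts across Horovod ranks before the final ratio, which this sketch omits:

```python
def binary_f1(preds, labels):
    # Count true positives, false positives, and false negatives for class 1.
    tp = sum(1 for p, t in zip(preds, labels) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(preds, labels) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(preds, labels) if p == 0 and t == 1)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
```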
def test_squad_evaluate(self):
evaluate.hvd = hvd
hvd.init()
label = [{'paragraphs':\\
[{'qas':[{'answers': [{'answer_start': 177, 'text': 'Denver Broncos'}, \\
{'answer_start': 177, 'text': 'Denver Broncos'}, \\
{'answer_start': 177, 'text': 'Denver Broncos'}], \\
'question': 'Which NFL team represented the AFC at Super Bowl 50?', \\
'id': '56be4db0acb8001400a502ec'}]}]}]
preds = {'56be4db0acb8001400a502ec': 'Denver Broncos'}
f1 = evaluate(preds, label)
self.assertEqual(f1, 100.)
dataset = [{'paragraphs':\\
[{'qas':[{'answers': [{'answer_start': 177, 'text': 'Denver Broncos'}, \\
{'answer_start': 177, 'text': 'Denver Broncos'}, \\
{'answer_start': 177, 'text': 'Denver Broncos'}], \\
'question': 'Which NFL team represented the AFC at Super Bowl 50?', \\
'id': '56be4db0acb8001400a502ec'}]}]}]
predictions = {'56be4db0acb8001400a502ec': 'Denver Broncos'}
f1_squad = evaluate_squad(dataset,predictions)
self.assertEqual(f1_squad['f1'], 100.)
self.assertEqual(f1_squad['exact_match'], 100.)
def test_pytorch_F1(self):
metrics = METRICS('pytorch')
F1 = metrics['F1']()
import horovod.torch as hvd
F1.hvd = hvd
hvd.init()
F1.reset()
if hvd.rank() == 0:
preds = [1]
labels = [2]
else:
preds = [1]
labels = [1, 1]
F1.update(preds, labels)
self.assertEqual(F1.result(), 0.8)
def test_tensorflow_topk(self):
metrics = METRICS('tensorflow')
top1 = metrics['topk']()
top1.reset()
self.assertEqual(top1.result(), 0)
top2 = metrics['topk'](k=2)
top3 = metrics['topk'](k=3)
top1.hvd = hvd
top2.hvd = hvd
top3.hvd = hvd
hvd.init()
if hvd.rank() == 0:
predicts = [[0, 0.2, 0.9, 0.3]]
labels = [[0, 1, 0, 0]]
single_predict = [0, 0.2, 0.9, 0.3]
sparse_labels = [2]
single_label = 2
else:
predicts = [[0, 0.9, 0.8, 0]]
labels = [[0, 0, 1, 0]]
single_predict = [0, 0.2, 0.9, 0.3]
sparse_labels = [2]
single_label = 2
# test functionality of one-hot label
top1.update(predicts, labels)
top2.update(predicts, labels)
top3.update(predicts, labels)
self.assertEqual(top1.result(), 0.0)
self.assertEqual(top2.result(), 0.5)
self.assertEqual(top3.result(), 1)
# test functionality of sparse label
top1.reset()
top2.reset()
top3.reset()
top1.update(predicts, sparse_labels)
top2.update(predicts, sparse_labels)
top3.update(predicts, sparse_labels)
self.assertEqual(top1.result(), 0.5)
self.assertEqual(top2.result(), 1)
self.assertEqual(top3.result(), 1)
# test functionality of single label
top1.reset()
top2.reset()
top3.reset()
top1.update(single_predict, single_label)
top2.update(single_predict, single_label)
top3.update(single_predict, single_label)
self.assertEqual(top1.result(), 1)
self.assertEqual(top2.result(), 1)
self.assertEqual(top3.result(), 1)
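The sparse-label expectations above follow the usual top-k definition: a sample is correct when its true label is among the k highest scores. A single-process sketch (function name illustrative):

```python
import numpy as np

def topk_accuracy(probs, labels, k=1):
    probs = np.atleast_2d(np.asarray(probs, dtype=float))
    labels = np.asarray(labels).reshape(-1)
    # Indices of the k largest scores per row.
    topk = np.argsort(probs, axis=1)[:, -k:]
    return float(np.mean([t in row for t, row in zip(labels, topk)]))

# Pooled predicts from both ranks with sparse label 2 for each sample:
preds = [[0, 0.2, 0.9, 0.3], [0, 0.9, 0.8, 0]]
print(topk_accuracy(preds, [2, 2], k=1))  # 0.5
print(topk_accuracy(preds, [2, 2], k=2))  # 1.0
```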
def test_tensorflow_mAP(self):
metrics = METRICS('tensorflow')
fake_dict = 'dog: 1'
if hvd.rank() == 0:
with open('anno_0.yaml', 'w', encoding = "utf-8") as f:
f.write(fake_dict)
while True:
if os.path.exists('anno_0.yaml'):
break
mAP = metrics['mAP']('anno_0.yaml')
mAP.hvd = hvd
self.assertEqual(mAP.category_map_reverse['dog'], 1)
detection = [
np.array([[5]]),
np.array([[5]]),
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([['a', 'b']]),
np.array([[]]),
np.array([b'000000397133.jpg'])
]
self.assertRaises(NotImplementedError, mAP.update, detection, ground_truth)
detection = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787]]),
np.array([[ 1., 1.]])
]
ground_truth = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[b'dog', b'dog']]),
np.array([[]]),
np.array([b'000000397133.jpg'])
]
self.assertRaises(NotImplementedError, mAP.update, detection, ground_truth)
mAP.result()
self.assertEqual(format(mAP.result(), '.5f'),
'0.00000')
detection = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
detection_2 = [
np.array([[8]]),
np.array([[[0.82776225, 0.5865939 , 0.8927653 , 0.6302338 ],
[0.8375764 , 0.6424138 , 0.9055594 , 0.6921875 ],
[0.57902956, 0.39394334, 0.8342961 , 0.5577197 ],
[0.7949219 , 0.6513021 , 0.8472295 , 0.68427753],
[0.809729 , 0.5947042 , 0.8539927 , 0.62916476],
[0.7258591 , 0.08907133, 1. , 0.86224866],
[0.43100086, 0.37782395, 0.8384069 , 0.5616918 ],
[0.32005906, 0.84334356, 1. , 1. ]]]),
np.array([[0.86698544, 0.7562499 , 0.66414887, 0.64498234,\\
0.63083494,0.46618757, 0.3914739 , 0.3094324 ]]),
np.array([[55., 55., 79., 55., 55., 67., 79., 82.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.56262296, 0.0015625 , 1. , 0.5431719 ],
[0.16374707, 0.60728127, 0.813911 , 0.77823436],
[0.5841452 , 0.21182813, 0.65156907, 0.24670312],
[0.8056206 , 0.048875 , 0.90124124, 0.1553125 ],
[0.6729742 , 0.09317187, 0.7696956 , 0.21203125],
[0.3848478 , 0.002125 , 0.61522245, 0.303 ],
[0.61548007, 0. , 0.7015925 , 0.097125 ],
[0.6381967 , 0.1865625 , 0.7184075 , 0.22534375],
[0.6274239 , 0.22104688, 0.71140516, 0.27134374],
[0.39566743, 0.24370313, 0.43578455, 0.284375 ],
[0.2673302 , 0.245625 , 0.3043794 , 0.27353126],
[0.7137705 , 0.15429688, 0.726815 , 0.17114063],
[0.6003747 , 0.25942189, 0.6438876 , 0.27320313],
[0.68845433, 0.13501562, 0.714637 , 0.17245312],
[0.69358313, 0.10959375, 0.7043091 , 0.12409375],
[0.493911 , 0. , 0.72571427, 0.299 ],
[0.69576114, 0.15107812, 0.70714283, 0.16332813],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([[]]),
np.array([[44, 67, 1, 49, 51, 51, 79, 1, 47, 47, 51, 51,\\
56, 50, 56, 56, 79, 57, 81]]),
np.array([b'000000397133.jpg'])
]
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.9358696 , 0.07528409, 0.99891305, 0.25 ],
[0.8242174 , 0.3309659 , 0.93508697, 0.47301137],
[0.77413046, 0.22599432, 0.9858696 , 0.8179261 ],
[0.32582608, 0.8575 , 0.98426086, 0.9984659 ],
[0.77795655, 0.6268466 , 0.89930433, 0.73434657],
[0.5396087 , 0.39053977, 0.8483913 , 0.5615057 ],
[0.58473915, 0.75661933, 0.5998261 , 0.83579546],
[0.80391306, 0.6129829 , 0.8733478 , 0.66201705],
[0.8737391 , 0.6579546 , 0.943 , 0.7053693 ],
[0.775 , 0.6549716 , 0.8227391 , 0.6882955 ],
[0.8130869 , 0.58292615, 0.90526086, 0.62551135],
[0.7844348 , 0.68735796, 0.98182607, 0.83329546],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62, 62, 67, 82, 52, 79, 81, 55, 55, 55, 55, 62, 55]]),
np.array([b'000000037777.jpg'])
]
mAP = metrics['mAP']()
self.assertEqual(mAP.result(), 0)
mAP.update(detection, ground_truth)
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.18182')
mAP.update(detection_2, ground_truth_2)
self.assertEqual(format(mAP.result(), '.5f'),
'0.20347')
mAP.reset()
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.18182')
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[[64, 62]]]),
np.array([b'000000037777.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64]]),
np.array([b'000000037700.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_2)
detection_1 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000011.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection_1, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000012.jpg'])
]
detection_2 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
self.assertRaises(ValueError, mAP.update, detection_2, ground_truth_2)
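mAP matching hinges on box IoU. With boxes in the [ymin, xmin, ymax, xmax] layout used by the arrays above, a minimal pairwise IoU sketch:

```python
def box_iou(a, b):
    # Intersection rectangle (zero area if the boxes do not overlap).
    ymin, xmin = max(a[0], b[0]), max(a[1], b[1])
    ymax, xmax = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ymax - ymin) * max(0.0, xmax - xmin)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

print(box_iou([0, 0, 2, 2], [0, 1, 2, 3]))  # ~0.3333
```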
def test_tensorflow_VOCmAP(self):
metrics = METRICS('tensorflow')
fake_dict = 'dog: 1'
if hvd.rank() == 0:
with open('anno_1.yaml', 'w', encoding = "utf-8") as f:
f.write(fake_dict)
while True:
if os.path.exists('anno_1.yaml'):
break
mAP = metrics['VOCmAP']('anno_1.yaml')
mAP.hvd = hvd
self.assertEqual(mAP.iou_thrs, 0.5)
self.assertEqual(mAP.map_points, 0)
self.assertEqual(mAP.category_map_reverse['dog'], 1)
detection = [
np.array([[5]]),
np.array([[5]]),
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([['a', 'b']]),
np.array([[]]),
np.array([b'000000397133.jpg'])
]
self.assertRaises(NotImplementedError, mAP.update, detection, ground_truth)
mAP = metrics['VOCmAP']()
detection = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
detection_2 = [
np.array([[8]]),
np.array([[[0.82776225, 0.5865939 , 0.8927653 , 0.6302338 ],
[0.8375764 , 0.6424138 , 0.9055594 , 0.6921875 ],
[0.57902956, 0.39394334, 0.8342961 , 0.5577197 ],
[0.7949219 , 0.6513021 , 0.8472295 , 0.68427753],
[0.809729 , 0.5947042 , 0.8539927 , 0.62916476],
[0.7258591 , 0.08907133, 1. , 0.86224866],
[0.43100086, 0.37782395, 0.8384069 , 0.5616918 ],
[0.32005906, 0.84334356, 1. , 1. ]]]),
np.array([[0.86698544, 0.7562499 , 0.66414887, 0.64498234,\\
0.63083494,0.46618757, 0.3914739 , 0.3094324 ]]),
np.array([[55., 55., 79., 55., 55., 67., 79., 82.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.56262296, 0.0015625 , 1. , 0.5431719 ],
[0.16374707, 0.60728127, 0.813911 , 0.77823436],
[0.5841452 , 0.21182813, 0.65156907, 0.24670312],
[0.8056206 , 0.048875 , 0.90124124, 0.1553125 ],
[0.6729742 , 0.09317187, 0.7696956 , 0.21203125],
[0.3848478 , 0.002125 , 0.61522245, 0.303 ],
[0.61548007, 0. , 0.7015925 , 0.097125 ],
[0.6381967 , 0.1865625 , 0.7184075 , 0.22534375],
[0.6274239 , 0.22104688, 0.71140516, 0.27134374],
[0.39566743, 0.24370313, 0.43578455, 0.284375 ],
[0.2673302 , 0.245625 , 0.3043794 , 0.27353126],
[0.7137705 , 0.15429688, 0.726815 , 0.17114063],
[0.6003747 , 0.25942189, 0.6438876 , 0.27320313],
[0.68845433, 0.13501562, 0.714637 , 0.17245312],
[0.69358313, 0.10959375, 0.7043091 , 0.12409375],
[0.493911 , 0. , 0.72571427, 0.299 ],
[0.69576114, 0.15107812, 0.70714283, 0.16332813],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([[]]),
np.array([[44, 67, 1, 49, 51, 51, 79, 1, 47, 47, 51, 51,\\
56, 50, 56, 56, 79, 57, 81]]),
np.array([b'000000397133.jpg'])
]
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.9358696 , 0.07528409, 0.99891305, 0.25 ],
[0.8242174 , 0.3309659 , 0.93508697, 0.47301137],
[0.77413046, 0.22599432, 0.9858696 , 0.8179261 ],
[0.32582608, 0.8575 , 0.98426086, 0.9984659 ],
[0.77795655, 0.6268466 , 0.89930433, 0.73434657],
[0.5396087 , 0.39053977, 0.8483913 , 0.5615057 ],
[0.58473915, 0.75661933, 0.5998261 , 0.83579546],
[0.80391306, 0.6129829 , 0.8733478 , 0.66201705],
[0.8737391 , 0.6579546 , 0.943 , 0.7053693 ],
[0.775 , 0.6549716 , 0.8227391 , 0.6882955 ],
[0.8130869 , 0.58292615, 0.90526086, 0.62551135],
[0.7844348 , 0.68735796, 0.98182607, 0.83329546],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62, 62, 67, 82, 52, 79, 81, 55, 55, 55, 55, 62, 55]]),
np.array([b'000000037777.jpg'])
]
self.assertEqual(mAP.result(), 0)
mAP.update(detection, ground_truth)
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.18182')
mAP.update(detection_2, ground_truth_2)
self.assertEqual(format(mAP.result(), '.5f'),
'0.20347')
mAP.reset()
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.18182')
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[[64, 62]]]),
np.array([b'000000037777.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64]]),
np.array([b'000000037700.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_2)
detection_1 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000011.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection_1, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000012.jpg'])
]
detection_2 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
self.assertRaises(ValueError, mAP.update, detection_2, ground_truth_2)
def test_tensorflow_COCOmAP(self):
metrics = METRICS('tensorflow')
fake_dict = 'dog: 1'
if hvd.rank() == 0:
with open('anno_2.yaml', 'w', encoding = "utf-8") as f:
f.write(fake_dict)
while True:
if os.path.exists('anno_2.yaml'):
break
mAP = metrics['COCOmAP']('anno_2.yaml')
mAP.hvd = hvd
self.assertEqual(mAP.category_map_reverse['dog'], 1)
detection = [
np.array([[5]]),
np.array([[5]]),
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([['a', 'b']]),
np.array([[]]),
np.array([b'000000397133.jpg'])
]
self.assertRaises(NotImplementedError, mAP.update, detection, ground_truth)
mAP = metrics['COCOmAP']()
detection = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762],
[0.40032804, 0.01218696, 0.6924763 , 0.30341768],
[0.62706745, 0.35748824, 0.6892729 , 0.41513762]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
detection_2 = [
np.array([[8]]),
np.array([[[0.82776225, 0.5865939 , 0.8927653 , 0.6302338 ],
[0.8375764 , 0.6424138 , 0.9055594 , 0.6921875 ],
[0.57902956, 0.39394334, 0.8342961 , 0.5577197 ],
[0.7949219 , 0.6513021 , 0.8472295 , 0.68427753],
[0.809729 , 0.5947042 , 0.8539927 , 0.62916476],
[0.7258591 , 0.08907133, 1. , 0.86224866],
[0.43100086, 0.37782395, 0.8384069 , 0.5616918 ],
[0.32005906, 0.84334356, 1. , 1. ]]]),
np.array([[0.86698544, 0.7562499 , 0.66414887, 0.64498234,\\
0.63083494,0.46618757, 0.3914739 , 0.3094324 ]]),
np.array([[55., 55., 79., 55., 55., 67., 79., 82.]])
]
ground_truth = [
np.array([[[0.5633255 , 0.34003124, 0.69857144, 0.4009531 ],
[0.56262296, 0.0015625 , 1. , 0.5431719 ],
[0.16374707, 0.60728127, 0.813911 , 0.77823436],
[0.5841452 , 0.21182813, 0.65156907, 0.24670312],
[0.8056206 , 0.048875 , 0.90124124, 0.1553125 ],
[0.6729742 , 0.09317187, 0.7696956 , 0.21203125],
[0.3848478 , 0.002125 , 0.61522245, 0.303 ],
[0.61548007, 0. , 0.7015925 , 0.097125 ],
[0.6381967 , 0.1865625 , 0.7184075 , 0.22534375],
[0.6274239 , 0.22104688, 0.71140516, 0.27134374],
[0.39566743, 0.24370313, 0.43578455, 0.284375 ],
[0.2673302 , 0.245625 , 0.3043794 , 0.27353126],
[0.7137705 , 0.15429688, 0.726815 , 0.17114063],
[0.6003747 , 0.25942189, 0.6438876 , 0.27320313],
[0.68845433, 0.13501562, 0.714637 , 0.17245312],
[0.69358313, 0.10959375, 0.7043091 , 0.12409375],
[0.493911 , 0. , 0.72571427, 0.299 ],
[0.69576114, 0.15107812, 0.70714283, 0.16332813],
[0.4763466 , 0.7769531 , 0.54334897, 0.9675937 ]]]),
np.array([[]]),
np.array([[44, 67, 1, 49, 51, 51, 79, 1, 47, 47, 51, 51,\\
56, 50, 56, 56, 79, 57, 81]]),
np.array([b'000000397133.jpg'])
]
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.9358696 , 0.07528409, 0.99891305, 0.25 ],
[0.8242174 , 0.3309659 , 0.93508697, 0.47301137],
[0.77413046, 0.22599432, 0.9858696 , 0.8179261 ],
[0.32582608, 0.8575 , 0.98426086, 0.9984659 ],
[0.77795655, 0.6268466 , 0.89930433, 0.73434657],
[0.5396087 , 0.39053977, 0.8483913 , 0.5615057 ],
[0.58473915, 0.75661933, 0.5998261 , 0.83579546],
[0.80391306, 0.6129829 , 0.8733478 , 0.66201705],
[0.8737391 , 0.6579546 , 0.943 , 0.7053693 ],
[0.775 , 0.6549716 , 0.8227391 , 0.6882955 ],
[0.8130869 , 0.58292615, 0.90526086, 0.62551135],
[0.7844348 , 0.68735796, 0.98182607, 0.83329546],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62, 62, 67, 82, 52, 79, 81, 55, 55, 55, 55, 62, 55]]),
np.array([b'000000037777.jpg'])
]
self.assertEqual(mAP.result(), 0)
mAP.update(detection, ground_truth)
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.14149')
mAP.update(detection_2, ground_truth_2)
self.assertEqual(format(mAP.result(), '.5f'),
'0.13366')
mAP.reset()
mAP.update(detection, ground_truth)
self.assertEqual(format(mAP.result(), '.5f'),
'0.14149')
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[[64, 62]]]),
np.array([b'000000037777.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64]]),
np.array([b'000000037700.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection, ground_truth_2)
detection_1 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787 , 0.60418576, 0.35155892, 0.31158054]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
ground_truth_1 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000011.jpg'])
]
self.assertRaises(ValueError, mAP.update, detection_1, ground_truth_1)
ground_truth_2 = [
np.array([[[0.51508695, 0.2911648 , 0.5903478 , 0.31360796],
[0.872 , 0.6190057 , 0.9306522 , 0.6591761 ]]]),
np.array([[]]),
np.array([[64, 62]]),
np.array([b'000000012.jpg'])
]
detection_2 = [
np.array([[[0.16117382, 0.59801614, 0.81511605, 0.7858219 ],
[0.5589304 , 0. , 0.98301625, 0.520178 ]]]),
np.array([[0.9267181 , 0.8510787]]),
np.array([[ 1., 67., 51., 79., 47.]])
]
self.assertRaises(ValueError, mAP.update, detection_2, ground_truth_2)
def test__accuracy(self):
if hvd.rank() == 0:
predicts1 = [1]
labels1 = [0]
predicts2 = [[0, 0]]
labels2 = [[0, 1]]
predicts3 = [[[0, 1], [0, 0], [0, 1]]]
labels3 = [[[0, 1], [0, 1], [1, 0]]]
predicts4 = [[0.2, 0.8]]
labels4 = [0]
else:
predicts1 = [0, 1, 1]
labels1 = [1, 1, 1]
predicts2 = [[0, 0]]
labels2 = [[1, 1]]
predicts3 = [[[0, 1], [0, 1], [0, 1]]]
labels3 = [[[1, 0], [1, 0], [1, 0]]]
predicts4 = [[0.1, 0.9], [0.3, 0.7], [0.4, 0.6]]
labels4 = [1, 0, 0]
import horovod.tensorflow as hvd_tf
metrics = METRICS('tensorflow')
acc = metrics['Accuracy']()
acc.hvd = hvd_tf
acc.update(predicts1, labels1)
acc_result = acc.result()
self.assertEqual(acc_result, 0.5)
acc.reset()
acc.update(predicts2, labels2)
self.assertEqual(acc.result(), 0.25)
acc.reset()
acc.update(predicts3, labels3)
self.assertEqual(acc.result(), 0.25)
acc.reset()
acc.update(predicts4, labels4)
self.assertEqual(acc.result(), 0.25)
acc.reset()
acc.update(1, 1)
self.assertEqual(acc.result(), 1.0)
wrong_predictions = [1, 0, 0]
wrong_labels = [[0, 1, 1]]
self.assertRaises(ValueError, acc.update, wrong_predictions, wrong_labels)
import horovod.torch as hvd_torch
hvd_torch.init()
metrics = METRICS('pytorch')
acc = metrics['Accuracy']()
acc.hvd = hvd_torch
acc.update(predicts1, labels1)
acc_result = acc.result()
self.assertEqual(acc_result, 0.5)
acc.reset()
acc.update(predicts2, labels2)
self.assertEqual(acc.result(), 0.25)
acc.reset()
acc.update(predicts3, labels3)
self.assertEqual(acc.result(), 0.25)
acc.reset()
acc.update(predicts4, labels4)
self.assertEqual(acc.result(), 0.25)
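The 0.25 expectations above follow from exact-match accuracy over the flattened, rank-pooled arrays. A single-process sketch (helper name illustrative):

```python
import numpy as np

def exact_match_accuracy(preds, labels):
    # Flatten both sides and compare element-wise.
    p = np.asarray(preds).reshape(-1)
    t = np.asarray(labels).reshape(-1)
    return float((p == t).mean())

# Pooled predicts2/labels2 from both ranks: only 1 of 4 entries matches.
print(exact_match_accuracy([[0, 0], [0, 0]], [[0, 1], [1, 1]]))  # 0.25
```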
    def test_mse(self):
        if hvd.rank() == 0:
            predicts1 = [1]
            labels1 = [0]
            predicts2 = [1, 1]
            labels2 = [0, 1]
        else:
            predicts1 = [0, 0, 1]
            labels1 = [1, 0, 0]
            predicts2 = [1, 1]
            labels2 = [1, 0]
        import horovod.tensorflow as hvd_tf
        metrics = METRICS('tensorflow')
        mse = metrics['MSE'](compare_label=False)
        mse.hvd = hvd_tf
        mse.update(predicts1, labels1)
        mse_result = mse.result()
        self.assertEqual(mse_result, 0.75)
        mse.update(predicts2, labels2)
        mse_result = mse.result()
        self.assertEqual(mse_result, 0.625)
        import horovod.torch as hvd_torch
        hvd_torch.init()
        metrics = METRICS('pytorch')
        mse = metrics['MSE']()
        mse.hvd = hvd_torch
        mse.update(predicts1, labels1)
        mse_result = mse.result()
        self.assertEqual(mse_result, 0.75)
        mse.update(predicts2, labels2)
        mse_result = mse.result()
        self.assertEqual(mse_result, 0.625)
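The 0.75 → 0.625 sequence reflects a running mean of squared errors accumulated across `update()` calls. Pooling both ranks' batches, a minimal sketch of that accumulator pattern (class name illustrative):

```python
class RunningMSE:
    """Accumulate squared error across update() calls; result() is the running mean."""
    def __init__(self):
        self.sq_sum = 0.0
        self.count = 0

    def update(self, preds, labels):
        for p, t in zip(preds, labels):
            self.sq_sum += (p - t) ** 2
            self.count += 1

    def result(self):
        return self.sq_sum / self.count if self.count else 0.0

mse = RunningMSE()
mse.update([1, 0, 0, 1], [0, 1, 0, 0])  # both ranks' first batch, pooled
print(mse.result())  # 0.75
mse.update([1, 1, 1, 1], [0, 1, 1, 0])  # both ranks' second batch, pooled
print(mse.result())  # 0.625
```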
def test_mae(self):
if hvd.rank() == 0:
predicts1 = [1]
labels1 = [0]
predicts2 = [1, 1]
labels2 = [1, 1]
else:
predicts1 = [0, 0, 1]
labels1 = [1, 0, 0]
predicts2 = [1, 1]
labels2 = [1, 0]
import horovod.tensorflow as hvd_tf
metrics = METRICS('tensorflow')
mae = metrics['MAE']()
mae.hvd = hvd_tf
mae.update(predicts1, labels1)
mae_result = mae.result()
self.assertEqual(mae_result, 0.75)
if hvd.rank() == 1:
mae.update(0, 1)
mae_result = mae.result()
self.assertEqual(mae_result, 0.8)
mae.reset()
mae.update(predicts2, labels2)
mae_result = mae.result()
self.assertEqual(mae_result, 0.25)
import horovod.torch as hvd_torch
hvd_torch.init()
metrics = METRICS('pytorch')
mae = metrics['MAE']()
mae.hvd = hvd_torch
mae.update(predicts1, labels1)
mae_result = mae.result()
self.assertEqual(mae_result, 0.75)
mae.update(predicts2, labels2)
mae_result = mae.result()
self.assertEqual(mae_result, 0.5)
self.assertRaises(AssertionError, mae.update, [1], [1, 2])
self.assertRaises(AssertionError, mae.update, 1, [1,2])
self.assertRaises(AssertionError, mae.update, [1, 2], [1])
self.assertRaises(AssertionError, mae.update, 1, np.array([1,2]))
def test_rmse(self):
if hvd.rank() == 0:
predicts1 = [1]
labels1 = [1]
predicts2 = [1, 1]
labels2 = [1, 0]
else:
predicts1 = [0, 0, 1]
labels1 = [0, 0, 0]
predicts2 = [1, 1]
labels2 = [0, 0]
import horovod.tensorflow as hvd_tf
metrics = METRICS('tensorflow')
rmse = metrics['RMSE']()
rmse.hvd = hvd_tf
rmse.update(predicts1, labels1)
rmse_result = rmse.result()
self.assertEqual(rmse_result, 0.5)
rmse.reset()
rmse.update(predicts2, labels2)
rmse_result = rmse.result()
self.assertAlmostEqual(rmse_result, np.sqrt(0.75))
import horovod.torch as hvd_torch
hvd_torch.init()
metrics = METRICS('pytorch')
rmse = metrics['RMSE']()
rmse.hvd = hvd_torch
rmse.update(predicts1, labels1)
rmse_result = rmse.result()
self.assertEqual(rmse_result, 0.5)
rmse.update(predicts2, labels2)
rmse_result = rmse.result()
self.assertAlmostEqual(rmse_result, np.sqrt(0.5))
def test_loss(self):
if hvd.rank() == 0:
predicts1 = [1]
labels1 = [0]
predicts2 = [1, 0, 1]
labels2 = [1, 0, 0]
predicts3 = [1, 0]
labels3 = [0, 1]
else:
predicts1 = [0, 0, 1]
labels1 = [1, 0, 0]
predicts2 = [1]
labels2 = [0]
predicts3 = [0, 1]
labels3 = [0, 0]
import horovod.tensorflow as hvd_tf
metrics = METRICS('tensorflow')
loss = metrics['Loss']()
loss.hvd = hvd_tf
loss.update(predicts1, labels1)
loss_result = loss.result()
self.assertEqual(loss_result, 0.5)
loss.update(predicts2, labels2)
loss_result = loss.result()
self.assertEqual(loss_result, 0.625)
loss.reset()
loss.update(predicts3, labels3)
self.assertEqual(loss.result(), 0.5)
import horovod.torch as hvd_torch
hvd_torch.init()
metrics = METRICS('pytorch')
loss = metrics['Loss']()
loss.hvd = hvd_torch
loss.update(predicts1, labels1)
loss_result = loss.result()
self.assertEqual(loss_result, 0.5)
loss.update(predicts2, labels2)
loss_result = loss.result()
self.assertEqual(loss_result, 0.625)
loss.reset()
loss.update(predicts3, labels3)
self.assertEqual(loss.result(), 0.5)
if __name__ == "__main__":
unittest.main()
"""
with open('fake_ut.py', 'w', encoding="utf-8") as f:
f.write(fake_ut)
class TestDistributed(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        build_fake_ut()

    @classmethod
    def tearDownClass(cls):
        os.remove('fake_ut.py')
        shutil.rmtree('./saved', ignore_errors=True)
        shutil.rmtree('runs', ignore_errors=True)

    @unittest.skipIf(tensorflow.version.VERSION >= '2.8.0', "Only supports tf 2.7.0 or below")
    def test_distributed(self):
        distributed_cmd = 'horovodrun -np 2 python fake_ut.py'
        p = subprocess.Popen(distributed_cmd, preexec_fn=os.setsid, stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE, shell=True)  # nosec
        try:
            out, error = p.communicate()
            matches = re.findall(r'FAILED', error.decode('utf-8'))
            self.assertEqual(matches, [])
            matches = re.findall(r'OK', error.decode('utf-8'))
            self.assertTrue(len(matches) > 0)
        except KeyboardInterrupt:
            os.killpg(os.getpgid(p.pid), signal.SIGKILL)
            self.fail("distributed run was interrupted")

if __name__ == "__main__":
    unittest.main()
| 43.729675 | 96 | 0.473112 | 4,750 | 43,030 | 4.223368 | 0.098105 | 0.055132 | 0.023528 | 0.015353 | 0.849011 | 0.824535 | 0.796022 | 0.78022 | 0.770899 | 0.762873 | 0 | 0.324909 | 0.377434 | 43,030 | 983 | 97 | 43.774161 | 0.423943 | 0.000953 | 0 | 0.725191 | 0 | 0.019629 | 0.971952 | 0.16719 | 0 | 0 | 0 | 0 | 0.098146 | 1 | 0.004362 | false | 0 | 0.030534 | 0 | 0.035987 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
fb88ed3aa9e5a1c09af74d57352a68aea9d75307 | 4,943 | py | Python | Diagnostics/plotting.py | bongiovimatthew/jigsaw-rl | e9589a78b62a7645fe9bc054f0411230eb249acf | [
"MIT"
] | 1 | 2018-09-11T23:50:38.000Z | 2018-09-11T23:50:38.000Z | Diagnostics/plotting.py | bongiovimatthew/jigsaw-rl | e9589a78b62a7645fe9bc054f0411230eb249acf | [
"MIT"
] | 6 | 2018-09-11T23:46:57.000Z | 2018-09-15T00:33:45.000Z | Diagnostics/plotting.py | bongiovimatthew/jigsaw-rl | e9589a78b62a7645fe9bc054f0411230eb249acf | [
"MIT"
] | null | null | null | import numpy as np
import pandas as pd
from collections import namedtuple
from matplotlib import pyplot as plt

EpisodeStats = namedtuple(
    "Stats", ["episode_lengths", "episode_rewards", "episode_running_variance"])
TimestepStats = namedtuple("Stats", ["cumulative_rewards", "regrets"])


def plot_episode_stats(stats, smoothing_window=10, hideplot=False):
    # Plot the episode length over time
    fig1 = plt.figure(figsize=(10, 5))
    plt.plot(stats.episode_lengths)
    plt.xlabel("Episode")
    plt.ylabel("Episode Length")
    plt.title("Episode Length over Time")
    if hideplot:
        plt.close(fig1)
    else:
        plt.show(fig1)

    # Plot the episode reward over time
    fig2 = plt.figure(figsize=(10, 5))
    rewards_smoothed = pd.Series(stats.episode_rewards).rolling(
        smoothing_window, min_periods=smoothing_window).mean()
    plt.plot(rewards_smoothed)
    plt.xlabel("Episode")
    plt.ylabel("Episode Reward (Smoothed)")
    plt.title("Episode Reward over Time (Smoothed over window size {})".format(smoothing_window))
    if hideplot:
        plt.close(fig2)
    else:
        plt.show(fig2)

    return fig1, fig2


def plot_pgresults(stats, smoothing_window=20, hideplot=False):
    # Plot the episode length over time
    fig1 = plt.figure(figsize=(10, 5))
    plt.plot(stats.episode_lengths)
    plt.xlabel("Episode")
    plt.ylabel("Episode Length")
    plt.title("Episode Length over Time")
    if hideplot:
        plt.close(fig1)
    else:
        plt.show(fig1)

    # Plot the episode reward over time
    fig2 = plt.figure(figsize=(10, 5))
    rewards_smoothed = pd.Series(stats.episode_rewards).rolling(
        smoothing_window, min_periods=smoothing_window).mean()
    plt.plot(rewards_smoothed)
    plt.xlabel("Episode")
    plt.ylabel("Episode Reward (Smoothed)")
    plt.title("Episode Reward over Time (Smoothed over window size {})".format(smoothing_window))
    if hideplot:
        plt.close(fig2)
    else:
        plt.show(fig2)

    # Plot the running variance over time
    fig3 = plt.figure(figsize=(10, 5))
    plt.plot(stats.episode_running_variance)
    plt.xlabel("Episode")
    plt.ylabel("Running Variance")
    plt.title("Running Variance over Time")
    if hideplot:
        plt.close(fig3)
    else:
        plt.show(fig3)

    # Plot cumulative time steps against episode number
    fig4 = plt.figure(figsize=(10, 5))
    plt.plot(np.arange(len(stats.episode_lengths)), np.cumsum(stats.episode_lengths))
    plt.xlabel("Episode")
    plt.ylabel("Cumulative Episode Length")
    plt.title("Cumulative Episode Length over Time")
    if hideplot:
        plt.close(fig4)
    else:
        plt.show(fig4)

    return fig1, fig2, fig3, fig4


def plot_dqnresults(stats, smoothing_window=20, hideplot=False):
    # Plot the episode length over time
    fig1 = plt.figure(figsize=(10, 5))
    plt.plot(stats.episode_lengths)
    plt.xlabel("Episode")
    plt.ylabel("Episode Length")
    plt.title("Episode Length over Time")
    if hideplot:
        plt.close(fig1)
    else:
        plt.show(fig1)

    # Plot the episode reward over time
    fig2 = plt.figure(figsize=(10, 5))
    rewards_smoothed = pd.Series(stats.episode_rewards).rolling(
        smoothing_window, min_periods=smoothing_window).mean()
    plt.plot(rewards_smoothed)
    plt.xlabel("Episode")
    plt.ylabel("Episode Reward (Smoothed)")
    plt.title("Episode Reward over Time (Smoothed over window size {})".format(smoothing_window))
    if hideplot:
        plt.close(fig2)
    else:
        plt.show(fig2)

    # Plot cumulative time steps against episode number
    fig4 = plt.figure(figsize=(10, 5))
    plt.plot(np.arange(len(stats.episode_lengths)), np.cumsum(stats.episode_lengths))
    plt.xlabel("Episode")
    plt.ylabel("Cumulative Episode Length")
    plt.title("Cumulative Episode Length over Time")
    if hideplot:
        plt.close(fig4)
    else:
        plt.show(fig4)

    # No fig3 is created in this function, so only the three figures above are returned
    return fig1, fig2, fig4


def plot_reward_regret(stats, smoothing_window=1, hideplot=False):
    # Plot the cumulative reward over time
    fig1 = plt.figure(figsize=(10, 5))
    plt.plot(stats.cumulative_rewards)
    plt.xlabel("Timestep")
    plt.ylabel("Cumulative Reward")
    plt.title("Cumulative Reward over Timestep")
    if hideplot:
        plt.close(fig1)
    else:
        plt.show(fig1)

    # Plot the regret over time
    fig2 = plt.figure(figsize=(10, 5))
    plt.plot(stats.regrets)
    plt.xlabel("Timestep")
    plt.ylabel("Regret")
    plt.title("Regret over Timestep")
    if hideplot:
        plt.close(fig2)
    else:
        plt.show(fig2)

    return fig1, fig2


def plot_arm_rewards(y, hideplot=False):
    N = len(y)
    x = range(N)
    width = 1 / 1.5
    fig1 = plt.figure(figsize=(10, 5))
    plt.bar(x, y, width)
    plt.xlabel("Arm")
    plt.ylabel("Probability")
    plt.title("Arm's Reward Distribution")
    if hideplot:
        plt.close(fig1)
    else:
        plt.show(fig1)

    return fig1
| 28.572254 | 97 | 0.664374 | 651 | 4,943 | 4.970814 | 0.121352 | 0.042027 | 0.059333 | 0.066749 | 0.801298 | 0.777503 | 0.762052 | 0.754017 | 0.740729 | 0.717553 | 0 | 0.024491 | 0.215254 | 4,943 | 172 | 98 | 28.738372 | 0.809745 | 0.075056 | 0 | 0.761194 | 0 | 0 | 0.174781 | 0.005263 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037313 | false | 0 | 0.029851 | 0 | 0.104478 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
fbb0c7886ab94aaa1e00208dd268bcd5ede401b5 | 127 | py | Python | gsfpy/gsfSBSensorSpecific.py | irewolepeter/gsfpy_USM_Implementation | c4614ac3f7d833eb86ea38c7708108b130f96612 | [
"MIT"
] | 7 | 2020-07-01T07:12:19.000Z | 2022-01-20T20:39:57.000Z | gsfpy/gsfSBSensorSpecific.py | irewolepeter/gsfpy_USM_Implementation | c4614ac3f7d833eb86ea38c7708108b130f96612 | [
"MIT"
] | 36 | 2020-06-23T09:10:15.000Z | 2022-03-22T10:27:58.000Z | gsfpy/gsfSBSensorSpecific.py | irewolepeter/gsfpy_USM_Implementation | c4614ac3f7d833eb86ea38c7708108b130f96612 | [
"MIT"
] | 2 | 2021-02-07T13:21:52.000Z | 2021-06-24T19:16:16.000Z | from gsfpy import mirror_default_gsf_version_submodule
mirror_default_gsf_version_submodule(globals(), "gsfSBSensorSpecific")
| 31.75 | 70 | 0.889764 | 15 | 127 | 7 | 0.666667 | 0.247619 | 0.304762 | 0.438095 | 0.609524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055118 | 127 | 3 | 71 | 42.333333 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0.149606 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
837707f5f63be25a4ebf9fc9fa14b42c0c9e9b84 | 22 | py | Python | test.py | CCPPSS/python | 575659af9257038aa7d1343a19787c1569a66803 | [
"MulanPSL-1.0"
] | null | null | null | test.py | CCPPSS/python | 575659af9257038aa7d1343a19787c1569a66803 | [
"MulanPSL-1.0"
] | null | null | null | test.py | CCPPSS/python | 575659af9257038aa7d1343a19787c1569a66803 | [
"MulanPSL-1.0"
] | null | null | null | print('c')
print('c')
| 7.333333 | 10 | 0.545455 | 4 | 22 | 3 | 0.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 22 | 2 | 11 | 11 | 0.6 | 0 | 0 | 1 | 0 | 0 | 0.090909 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
839c31c6f8aba804e3537751511a0b9123504f15 | 2,250 | py | Python | exam/SandClock.py | yani-valeva/Programming-Basics-Python | c553d331ffd210d362df0098bedf28e125a65dbf | [
"MIT"
] | null | null | null | exam/SandClock.py | yani-valeva/Programming-Basics-Python | c553d331ffd210d362df0098bedf28e125a65dbf | [
"MIT"
] | null | null | null | exam/SandClock.py | yani-valeva/Programming-Basics-Python | c553d331ffd210d362df0098bedf28e125a65dbf | [
"MIT"
] | null | null | null | size = int(input())
hours = int(input())
question = 1
current_size = size
current_hour = hours
if hours == 0:
    print('*' + ' *' * (current_size - 1))
else:
    print('-' + ' -' * (current_size - 1))
for i in range(1, current_hour):
    current_size = current_size - 1
    if i % 2 == 1:
        print('?' * question + '- ' * (current_size - 1) + '-' + '?' * question)
    else:
        print(' ' * question + '- ' * (current_size - 1) + '-' + ' ' * question)
    question = question + 1
current_size = current_size - 1
is_even = True
if hours % 2 == 0:
    is_even = False
if hours == 0:
    is_even = True
sec_size = size - (hours + 1)
for i in range(1, sec_size + 1):
    if is_even == True:
        print('?' * question + '* ' * (current_size - 1) + '*' + '?' * question)
        is_even = False
    else:
        print(' ' * question + '* ' * (current_size - 1) + '*' + ' ' * question)
        is_even = True
    if hours == 0 and i == sec_size - 1:
        current_size = current_size - 1
        question = question + 1
        break
    current_size = current_size - 1
    question = question + 1
if is_even == True:
    print('?' * question + 'o' + '?' * question)
    is_even = False
else:
    print(' ' * question + 'o' + ' ' * question)
    is_even = True
current_size = current_size + 1
question = question - 1
for i in range(1, sec_size + 1):
    if is_even == True:
        print('?' * question + '- ' * (current_size - 1) + '-' + '?' * question)
        is_even = False
    else:
        print(' ' * question + '- ' * (current_size - 1) + '-' + ' ' * question)
        is_even = True
    if hours == 0 and i == sec_size - 1:
        current_size = current_size + 1
        question = question - 1
        break
    current_size = current_size + 1
    question = question - 1
for i in range(1, current_hour):
    if is_even == True:
        print('?' * question + '* ' * (current_size - 1) + '*' + '?' * question)
        is_even = False
    else:
        print(' ' * question + '* ' * (current_size - 1) + '*' + ' ' * question)
        is_even = True
    current_size = current_size + 1
    question = question - 1
if hours == 0:
    print('-' + ' -' * (current_size - 1))
else:
    print('*' + ' *' * (current_size - 1))
790214d60cffa2a494c5dcf9a8ff558464d131ab | 149 | py | Python | src/aijack/defense/ckks/__init__.py | luoshenseeker/AIJack | 4e871a5b3beb4b7c976d38060d6956efcebf880d | [
"MIT"
] | 24 | 2021-11-17T02:16:47.000Z | 2022-03-27T01:04:08.000Z | src/aijack/defense/ckks/__init__.py | luoshenseeker/AIJack | 4e871a5b3beb4b7c976d38060d6956efcebf880d | [
"MIT"
] | 9 | 2021-12-03T06:09:27.000Z | 2022-03-29T06:33:53.000Z | src/aijack/defense/ckks/__init__.py | luoshenseeker/AIJack | 4e871a5b3beb4b7c976d38060d6956efcebf880d | [
"MIT"
] | 5 | 2022-01-12T09:58:04.000Z | 2022-03-17T09:29:04.000Z | from .encoder import CKKSEncoder # noqa: F401
from .encrypter import CKKSEncrypter # noqa: F401
from .plaintext import CKKSPlaintext # noqa: F401
| 37.25 | 50 | 0.778523 | 18 | 149 | 6.444444 | 0.555556 | 0.206897 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072 | 0.161074 | 149 | 3 | 51 | 49.666667 | 0.856 | 0.214765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
791a8c2f507bf3ec800896c79cb0eb3eda872906 | 61 | py | Python | nodes/IGGrid/__init__.py | bertrandboudaud/imagegraph | 7c95b645edeb1a68d56c6c3b19f1ff6fde413afc | [
"MIT"
] | null | null | null | nodes/IGGrid/__init__.py | bertrandboudaud/imagegraph | 7c95b645edeb1a68d56c6c3b19f1ff6fde413afc | [
"MIT"
] | null | null | null | nodes/IGGrid/__init__.py | bertrandboudaud/imagegraph | 7c95b645edeb1a68d56c6c3b19f1ff6fde413afc | [
"MIT"
] | null | null | null | from . import IGGrid
def get():
    return IGGrid.IGGrid()
| 10.166667 | 26 | 0.655738 | 8 | 61 | 5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229508 | 61 | 5 | 27 | 12.2 | 0.851064 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |