Python 3.6.0 (v3.6.0:41df79263a11, Dec 23 2016, 08:06:12) [MSC v.1900 64 bit (AMD64)] on win32
Type "copyright", "credits" or "license()" for more information.
>>>
======== RESTART: D:\workspace\tipr\tipr-first-assignment\src\main.py ========
Welcome to the world of high and low dimensions!
>>> init()
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
>>> sk_fold_crossValidate(X,Y,KNN,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
precision recall f1-score support
0 0.25 0.01 0.03 444
1 0.42 0.23 0.30 819
2 0.40 0.79 0.53 809
micro avg 0.40 0.40 0.40 2072
macro avg 0.36 0.35 0.29 2072
weighted avg 0.38 0.40 0.33 2072
counter:- 1 ACC: 0.402992277992278
precision recall f1-score support
0 0.19 0.01 0.03 444
1 0.39 0.22 0.28 818
2 0.39 0.77 0.52 809
micro avg 0.39 0.39 0.39 2071
macro avg 0.32 0.33 0.28 2071
weighted avg 0.35 0.39 0.32 2071
counter:- 2 ACC: 0.3901496861419604
precision recall f1-score support
0 0.39 0.02 0.03 444
1 0.42 0.26 0.32 818
2 0.40 0.76 0.53 809
micro avg 0.41 0.41 0.41 2071
macro avg 0.40 0.35 0.29 2071
weighted avg 0.41 0.41 0.34 2071
counter:- 3 ACC: 0.4056011588604539
precision recall f1-score support
0 0.18 0.01 0.02 444
1 0.41 0.25 0.31 818
2 0.39 0.76 0.52 809
micro avg 0.40 0.40 0.40 2071
macro avg 0.33 0.34 0.28 2071
weighted avg 0.36 0.40 0.33 2071
counter:- 4 ACC: 0.39546112988894255
precision recall f1-score support
0 0.16 0.01 0.02 443
1 0.42 0.23 0.30 818
2 0.41 0.81 0.54 809
micro avg 0.41 0.41 0.41 2070
macro avg 0.33 0.35 0.29 2070
weighted avg 0.36 0.41 0.33 2070
counter:- 5 ACC: 0.40869565217391307
precision recall f1-score support
0 0.14 0.01 0.01 443
1 0.41 0.22 0.29 818
2 0.40 0.79 0.53 809
micro avg 0.40 0.40 0.40 2070
macro avg 0.31 0.34 0.28 2070
weighted avg 0.35 0.40 0.32 2070
counter:- 6 ACC: 0.39806763285024155
precision recall f1-score support
0 0.16 0.01 0.02 443
1 0.41 0.24 0.30 818
2 0.39 0.76 0.52 808
micro avg 0.39 0.39 0.39 2069
macro avg 0.32 0.34 0.28 2069
weighted avg 0.35 0.39 0.33 2069
counter:- 7 ACC: 0.39342677622039635
precision recall f1-score support
0 0.35 0.01 0.03 443
1 0.39 0.21 0.27 818
2 0.39 0.77 0.52 808
micro avg 0.39 0.39 0.39 2069
macro avg 0.38 0.33 0.27 2069
weighted avg 0.38 0.39 0.32 2069
counter:- 8 ACC: 0.3895601739971
precision recall f1-score support
0 0.09 0.00 0.01 443
1 0.41 0.23 0.30 818
2 0.40 0.78 0.53 808
micro avg 0.40 0.40 0.40 2069
macro avg 0.30 0.34 0.28 2069
weighted avg 0.33 0.40 0.32 2069
counter:- 9 ACC: 0.3958434026099565
precision recall f1-score support
0 0.07 0.00 0.01 443
1 0.38 0.22 0.28 818
2 0.39 0.77 0.52 808
micro avg 0.39 0.39 0.39 2069
macro avg 0.28 0.33 0.27 2069
weighted avg 0.32 0.39 0.31 2069
counter:- 10 ACC: 0.38569357177380376
Best Result:= 0.39654914625090465
{'best_acc:': 0.39654914625090465}
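A hedged reconstruction of the `sk_fold_crossValidate` driver seen above: a StratifiedKFold loop that calls a model function per fold, prints each fold's accuracy, and reports the fold mean as "Best Result" (the logged value 0.39654914… is the mean of the ten fold accuracies, which is what this sketch assumes). Function names mirror the transcript; the body and the synthetic data are assumptions, not the assignment's actual code.

```python
# Assumed shape of sk_fold_crossValidate from main.py; the real
# implementation in the assignment repo may differ.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier

def sk_fold_crossValidate(X, Y, model, k):
    skf = StratifiedKFold(n_splits=k, random_state=None, shuffle=False)
    print(skf)
    accs = []
    for counter, (train_idx, test_idx) in enumerate(skf.split(X, Y), start=1):
        acc = model(X[train_idx], X[test_idx], Y[train_idx], Y[test_idx])
        print('counter:-', counter, 'ACC:', acc)
        accs.append(acc)
    best = float(np.mean(accs))  # "Best Result" in the log is the fold mean
    print('Best Result:=', best)
    return {'best_acc:': best}

def skKNN(X_train, X_test, Y_train, Y_test):
    # stand-in model function with the signature the driver expects
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, Y_train)
    return clf.score(X_test, Y_test)

# synthetic 3-class data standing in for the PubMed embeddings
X, Y = make_classification(n_samples=300, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)
result = sk_fold_crossValidate(X, Y, skKNN, 10)
```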
>>>
sk_fold_crossValidate(X,Y,naiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
no. of classes: 3 Data Shape: (18629, 128)
Traceback (most recent call last):
File "<pyshell#3>", line 2, in <module>
sk_fold_crossValidate(X,Y,naiveClass,10)
File "D:\workspace\tipr\tipr-first-assignment\src\main.py", line 40, in sk_fold_crossValidate
tmp=model(X_train, X_test, Y_train, Y_test)
File "D:\workspace\tipr\tipr-first-assignment\src\main.py", line 77, in naiveClass
print(confusion_matrix(Y_test, y_pred))
NameError: name 'y_pred' is not defined
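The traceback above shows `naiveClass` calling `confusion_matrix(Y_test, y_pred)` before `y_pred` was ever assigned. A minimal sketch of the likely fix, with sklearn's `GaussianNB` as a stand-in model (the assignment's own `naiveClass` body is not visible in this log):

```python
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

def naiveClass(X_train, X_test, Y_train, Y_test):
    model = GaussianNB().fit(X_train, Y_train)
    y_pred = model.predict(X_test)  # must be assigned before line 77 uses it
    print(confusion_matrix(Y_test, y_pred))
    return accuracy_score(Y_test, y_pred)

# synthetic stand-in data
X, Y = make_classification(n_samples=200, n_features=6, n_classes=3,
                           n_informative=4, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
acc = naiveClass(X_tr, X_te, Y_tr, Y_te)
```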
>>>
======== RESTART: D:\workspace\tipr\tipr-first-assignment\src\main.py ========
Welcome to the world of high and low dimensions!
>>> init()
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
>>> sk_fold_crossValidate(X,Y,naiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
no. of classes: 3 Data Shape: (18629, 128)
counter:- 1 ACC: 0.4025096525096525
no. of classes: 3 Data Shape: (18630, 128)
counter:- 2 ACC: 0.40994688556253017
no. of classes: 3 Data Shape: (18630, 128)
counter:- 3 ACC: 0.4398841139546113
no. of classes: 3 Data Shape: (18630, 128)
counter:- 4 ACC: 0.4505070014485756
no. of classes: 3 Data Shape: (18631, 128)
counter:- 5 ACC: 0.4444444444444444
no. of classes: 3 Data Shape: (18631, 128)
counter:- 6 ACC: 0.45169082125603865
no. of classes: 3 Data Shape: (18632, 128)
counter:- 7 ACC: 0.4485258579023683
no. of classes: 3 Data Shape: (18632, 128)
counter:- 8 ACC: 0.4224262928951184
no. of classes: 3 Data Shape: (18632, 128)
counter:- 9 ACC: 0.41324311261478974
no. of classes: 3 Data Shape: (18632, 128)
counter:- 10 ACC: 0.403093281778637
Best Result:= 0.42862714643667654
{'best_acc:': 0.42862714643667654}
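`naiveClass` above (which prints the class count and training-set shape per fold) is the assignment's own classifier. A minimal from-scratch Gaussian naive Bayes with the same signature is sketched below; this is an assumption about what the function does, inferred from its name and output, not the repo's actual code.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

def naive_bayes_from_scratch(X_train, X_test, Y_train, Y_test):
    classes = np.unique(Y_train)
    print('no. of classes:', len(classes), 'Data Shape:', X_train.shape)
    log_scores = []
    for c in classes:
        Xc = X_train[Y_train == c]
        mu, var = Xc.mean(axis=0), Xc.var(axis=0) + 1e-9  # variance smoothing
        prior = np.log(len(Xc) / len(X_train))
        # per-class Gaussian log-likelihood, summed over features
        ll = -0.5 * (np.log(2 * np.pi * var) + (X_test - mu) ** 2 / var)
        log_scores.append(prior + ll.sum(axis=1))
    y_pred = classes[np.argmax(np.vstack(log_scores), axis=0)]
    return float((y_pred == Y_test).mean())

# synthetic stand-in for the (18629, 128) embedding matrix
X, Y = make_classification(n_samples=200, n_features=6, n_classes=3,
                           n_informative=4, random_state=2)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=2)
acc = naive_bayes_from_scratch(X_tr, X_te, Y_tr, Y_te)
```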
>>> sk_fold_crossValidate(X,Y,sknaiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
Gaussian Naive Bayes model accuracy(in %): 39.285714285714285
[[ 94 168 182]
[155 351 313]
[159 281 369]]
precision recall f1-score support
0 0.23 0.21 0.22 444
1 0.44 0.43 0.43 819
2 0.43 0.46 0.44 809
micro avg 0.39 0.39 0.39 2072
macro avg 0.37 0.37 0.37 2072
weighted avg 0.39 0.39 0.39 2072
counter:- 1 ACC: 0.39285714285714285
Gaussian Naive Bayes model accuracy(in %): 39.2563978754225
[[107 164 173]
[175 314 329]
[149 268 392]]
precision recall f1-score support
0 0.25 0.24 0.24 444
1 0.42 0.38 0.40 818
2 0.44 0.48 0.46 809
micro avg 0.39 0.39 0.39 2071
macro avg 0.37 0.37 0.37 2071
weighted avg 0.39 0.39 0.39 2071
counter:- 2 ACC: 0.392563978754225
Gaussian Naive Bayes model accuracy(in %): 43.74698213423467
[[109 152 183]
[136 380 302]
[131 261 417]]
precision recall f1-score support
0 0.29 0.25 0.27 444
1 0.48 0.46 0.47 818
2 0.46 0.52 0.49 809
micro avg 0.44 0.44 0.44 2071
macro avg 0.41 0.41 0.41 2071
weighted avg 0.43 0.44 0.43 2071
counter:- 3 ACC: 0.4374698213423467
Gaussian Naive Bayes model accuracy(in %): 45.48527281506519
[[116 169 159]
[134 400 284]
[127 256 426]]
precision recall f1-score support
0 0.31 0.26 0.28 444
1 0.48 0.49 0.49 818
2 0.49 0.53 0.51 809
micro avg 0.45 0.45 0.45 2071
macro avg 0.43 0.43 0.43 2071
weighted avg 0.45 0.45 0.45 2071
counter:- 4 ACC: 0.45485272815065186
Gaussian Naive Bayes model accuracy(in %): 45.507246376811594
[[139 151 153]
[133 369 316]
[103 272 434]]
precision recall f1-score support
0 0.37 0.31 0.34 443
1 0.47 0.45 0.46 818
2 0.48 0.54 0.51 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.43 0.44 2070
weighted avg 0.45 0.46 0.45 2070
counter:- 5 ACC: 0.45507246376811594
Gaussian Naive Bayes model accuracy(in %): 45.94202898550724
[[132 144 167]
[155 380 283]
[134 236 439]]
precision recall f1-score support
0 0.31 0.30 0.31 443
1 0.50 0.46 0.48 818
2 0.49 0.54 0.52 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.44 0.43 2070
weighted avg 0.46 0.46 0.46 2070
counter:- 6 ACC: 0.45942028985507244
Gaussian Naive Bayes model accuracy(in %): 43.64427259545674
[[118 157 168]
[153 366 299]
[147 242 419]]
precision recall f1-score support
0 0.28 0.27 0.27 443
1 0.48 0.45 0.46 818
2 0.47 0.52 0.49 808
micro avg 0.44 0.44 0.44 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.44 0.43 2069
counter:- 7 ACC: 0.4364427259545674
Gaussian Naive Bayes model accuracy(in %): 42.67762203963267
[[126 136 181]
[141 351 326]
[146 256 406]]
precision recall f1-score support
0 0.31 0.28 0.29 443
1 0.47 0.43 0.45 818
2 0.44 0.50 0.47 808
micro avg 0.43 0.43 0.43 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.43 0.43 2069
counter:- 8 ACC: 0.42677622039632673
Gaussian Naive Bayes model accuracy(in %): 40.50265828902852
[[104 144 195]
[152 332 334]
[141 265 402]]
precision recall f1-score support
0 0.26 0.23 0.25 443
1 0.45 0.41 0.43 818
2 0.43 0.50 0.46 808
micro avg 0.41 0.41 0.41 2069
macro avg 0.38 0.38 0.38 2069
weighted avg 0.40 0.41 0.40 2069
counter:- 9 ACC: 0.40502658289028515
Gaussian Naive Bayes model accuracy(in %): 37.554374093765105
[[101 163 179]
[173 312 333]
[179 265 364]]
precision recall f1-score support
0 0.22 0.23 0.23 443
1 0.42 0.38 0.40 818
2 0.42 0.45 0.43 808
micro avg 0.38 0.38 0.38 2069
macro avg 0.35 0.35 0.35 2069
weighted avg 0.38 0.38 0.38 2069
counter:- 10 ACC: 0.375543740937651
Best Result:= 0.42360256949063857
{'best_acc:': 0.42360256949063857}
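The `sknaiveClass` output above (accuracy in %, then a confusion matrix, then a classification report) matches the following hedged reconstruction built on sklearn's `GaussianNB`; the function name is from the transcript, the body is an assumption.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import (accuracy_score, classification_report,
                             confusion_matrix)
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

def sknaiveClass(X_train, X_test, Y_train, Y_test):
    gnb = GaussianNB().fit(X_train, Y_train)
    y_pred = gnb.predict(X_test)
    acc = accuracy_score(Y_test, y_pred)
    # the three printouts seen per fold in the log
    print('Gaussian Naive Bayes model accuracy(in %):', acc * 100)
    print(confusion_matrix(Y_test, y_pred))
    print(classification_report(Y_test, y_pred))
    return acc

# synthetic stand-in data
X, Y = make_classification(n_samples=200, n_features=6, n_classes=3,
                           n_informative=4, random_state=1)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=1)
acc = sknaiveClass(X_tr, X_te, Y_tr, Y_te)
```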
>>> setseed(42)
Traceback (most recent call last):
File "<pyshell#7>", line 1, in <module>
setseed(42)
NameError: name 'setseed' is not defined
>>> set.seed(42)
Traceback (most recent call last):
File "<pyshell#8>", line 1, in <module>
set.seed(42)
AttributeError: type object 'set' has no attribute 'seed'
>>> set(42)
Traceback (most recent call last):
File "<pyshell#9>", line 1, in <module>
set(42)
TypeError: 'int' object is not iterable
>>> seed(42)
Traceback (most recent call last):
File "<pyshell#10>", line 1, in <module>
seed(42)
NameError: name 'seed' is not defined
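None of the four calls attempted above (`setseed`, `set.seed`, `set`, `seed`) exists in Python; `set.seed` is R's idiom. Seeding is done per-library:

```python
import random
import numpy as np

random.seed(42)      # stdlib RNG
np.random.seed(42)   # NumPy's global RNG (what sklearn defaults draw from)

a = random.random()
b = np.random.rand()

# reseeding reproduces the same draws
random.seed(42)
np.random.seed(42)
assert a == random.random()
assert b == np.random.rand()
```

Alternatively, sklearn estimators and splitters accept a `random_state` argument directly, which avoids touching global RNG state.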
>>>
======== RESTART: D:\workspace\tipr\tipr-first-assignment\src\main.py ========
Welcome to the world of high and low dimensions!
>>> init()
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
>>> sk_fold_crossValidate(X,Y,naiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
no. of classes: 3 Data Shape: (18629, 128)
counter:- 1 ACC: 0.4025096525096525
no. of classes: 3 Data Shape: (18630, 128)
counter:- 2 ACC: 0.40994688556253017
no. of classes: 3 Data Shape: (18630, 128)
counter:- 3 ACC: 0.4398841139546113
no. of classes: 3 Data Shape: (18630, 128)
counter:- 4 ACC: 0.4505070014485756
no. of classes: 3 Data Shape: (18631, 128)
counter:- 5 ACC: 0.4444444444444444
no. of classes: 3 Data Shape: (18631, 128)
counter:- 6 ACC: 0.45169082125603865
no. of classes: 3 Data Shape: (18632, 128)
counter:- 7 ACC: 0.4485258579023683
no. of classes: 3 Data Shape: (18632, 128)
counter:- 8 ACC: 0.4224262928951184
no. of classes: 3 Data Shape: (18632, 128)
counter:- 9 ACC: 0.41324311261478974
no. of classes: 3 Data Shape: (18632, 128)
counter:- 10 ACC: 0.403093281778637
Best Result:= 0.42862714643667654
{'best_acc:': 0.42862714643667654}
>>> sk_fold_crossValidate(X,Y,sknaiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
Gaussian Naive Bayes model accuracy(in %): 39.285714285714285
[[ 94 168 182]
[155 351 313]
[159 281 369]]
precision recall f1-score support
0 0.23 0.21 0.22 444
1 0.44 0.43 0.43 819
2 0.43 0.46 0.44 809
micro avg 0.39 0.39 0.39 2072
macro avg 0.37 0.37 0.37 2072
weighted avg 0.39 0.39 0.39 2072
counter:- 1 ACC: 0.39285714285714285
Gaussian Naive Bayes model accuracy(in %): 39.2563978754225
[[107 164 173]
[175 314 329]
[149 268 392]]
precision recall f1-score support
0 0.25 0.24 0.24 444
1 0.42 0.38 0.40 818
2 0.44 0.48 0.46 809
micro avg 0.39 0.39 0.39 2071
macro avg 0.37 0.37 0.37 2071
weighted avg 0.39 0.39 0.39 2071
counter:- 2 ACC: 0.392563978754225
Gaussian Naive Bayes model accuracy(in %): 43.74698213423467
[[109 152 183]
[136 380 302]
[131 261 417]]
precision recall f1-score support
0 0.29 0.25 0.27 444
1 0.48 0.46 0.47 818
2 0.46 0.52 0.49 809
micro avg 0.44 0.44 0.44 2071
macro avg 0.41 0.41 0.41 2071
weighted avg 0.43 0.44 0.43 2071
counter:- 3 ACC: 0.4374698213423467
Gaussian Naive Bayes model accuracy(in %): 45.48527281506519
[[116 169 159]
[134 400 284]
[127 256 426]]
precision recall f1-score support
0 0.31 0.26 0.28 444
1 0.48 0.49 0.49 818
2 0.49 0.53 0.51 809
micro avg 0.45 0.45 0.45 2071
macro avg 0.43 0.43 0.43 2071
weighted avg 0.45 0.45 0.45 2071
counter:- 4 ACC: 0.45485272815065186
Gaussian Naive Bayes model accuracy(in %): 45.507246376811594
[[139 151 153]
[133 369 316]
[103 272 434]]
precision recall f1-score support
0 0.37 0.31 0.34 443
1 0.47 0.45 0.46 818
2 0.48 0.54 0.51 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.43 0.44 2070
weighted avg 0.45 0.46 0.45 2070
counter:- 5 ACC: 0.45507246376811594
Gaussian Naive Bayes model accuracy(in %): 45.94202898550724
[[132 144 167]
[155 380 283]
[134 236 439]]
precision recall f1-score support
0 0.31 0.30 0.31 443
1 0.50 0.46 0.48 818
2 0.49 0.54 0.52 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.44 0.43 2070
weighted avg 0.46 0.46 0.46 2070
counter:- 6 ACC: 0.45942028985507244
Gaussian Naive Bayes model accuracy(in %): 43.64427259545674
[[118 157 168]
[153 366 299]
[147 242 419]]
precision recall f1-score support
0 0.28 0.27 0.27 443
1 0.48 0.45 0.46 818
2 0.47 0.52 0.49 808
micro avg 0.44 0.44 0.44 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.44 0.43 2069
counter:- 7 ACC: 0.4364427259545674
Gaussian Naive Bayes model accuracy(in %): 42.67762203963267
[[126 136 181]
[141 351 326]
[146 256 406]]
precision recall f1-score support
0 0.31 0.28 0.29 443
1 0.47 0.43 0.45 818
2 0.44 0.50 0.47 808
micro avg 0.43 0.43 0.43 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.43 0.43 2069
counter:- 8 ACC: 0.42677622039632673
Gaussian Naive Bayes model accuracy(in %): 40.50265828902852
[[104 144 195]
[152 332 334]
[141 265 402]]
precision recall f1-score support
0 0.26 0.23 0.25 443
1 0.45 0.41 0.43 818
2 0.43 0.50 0.46 808
micro avg 0.41 0.41 0.41 2069
macro avg 0.38 0.38 0.38 2069
weighted avg 0.40 0.41 0.40 2069
counter:- 9 ACC: 0.40502658289028515
Gaussian Naive Bayes model accuracy(in %): 37.554374093765105
[[101 163 179]
[173 312 333]
[179 265 364]]
precision recall f1-score support
0 0.22 0.23 0.23 443
1 0.42 0.38 0.40 818
2 0.42 0.45 0.43 808
micro avg 0.38 0.38 0.38 2069
macro avg 0.35 0.35 0.35 2069
weighted avg 0.38 0.38 0.38 2069
counter:- 10 ACC: 0.375543740937651
Best Result:= 0.42360256949063857
{'best_acc:': 0.42360256949063857}
>>> sk_fold_crossValidate(X,Y,skKNN,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
[[138 162 144]
[235 310 274]
[282 251 276]]
precision recall f1-score support
0 0.21 0.31 0.25 444
1 0.43 0.38 0.40 819
2 0.40 0.34 0.37 809
micro avg 0.35 0.35 0.35 2072
macro avg 0.35 0.34 0.34 2072
weighted avg 0.37 0.35 0.36 2072
counter:- 1 ACC: 0.34942084942084944
[[139 152 153]
[260 302 256]
[244 269 296]]
precision recall f1-score support
0 0.22 0.31 0.26 444
1 0.42 0.37 0.39 818
2 0.42 0.37 0.39 809
micro avg 0.36 0.36 0.36 2071
macro avg 0.35 0.35 0.35 2071
weighted avg 0.38 0.36 0.36 2071
counter:- 2 ACC: 0.355866731047803
[[136 166 142]
[262 300 256]
[256 279 274]]
precision recall f1-score support
0 0.21 0.31 0.25 444
1 0.40 0.37 0.38 818
2 0.41 0.34 0.37 809
micro avg 0.34 0.34 0.34 2071
macro avg 0.34 0.34 0.33 2071
weighted avg 0.36 0.34 0.35 2071
counter:- 3 ACC: 0.34282955094157413
[[148 146 150]
[225 323 270]
[245 268 296]]
precision recall f1-score support
0 0.24 0.33 0.28 444
1 0.44 0.39 0.42 818
2 0.41 0.37 0.39 809
micro avg 0.37 0.37 0.37 2071
macro avg 0.36 0.36 0.36 2071
weighted avg 0.39 0.37 0.38 2071
counter:- 4 ACC: 0.37035248672139065
[[132 150 161]
[241 298 279]
[236 267 306]]
precision recall f1-score support
0 0.22 0.30 0.25 443
1 0.42 0.36 0.39 818
2 0.41 0.38 0.39 809
micro avg 0.36 0.36 0.36 2070
macro avg 0.35 0.35 0.34 2070
weighted avg 0.37 0.36 0.36 2070
counter:- 5 ACC: 0.35555555555555557
[[138 156 149]
[242 296 280]
[255 266 288]]
precision recall f1-score support
0 0.22 0.31 0.26 443
1 0.41 0.36 0.39 818
2 0.40 0.36 0.38 809
micro avg 0.35 0.35 0.35 2070
macro avg 0.34 0.34 0.34 2070
weighted avg 0.37 0.35 0.35 2070
counter:- 6 ACC: 0.34879227053140094
[[155 154 134]
[242 306 270]
[255 277 276]]
precision recall f1-score support
0 0.24 0.35 0.28 443
1 0.42 0.37 0.39 818
2 0.41 0.34 0.37 808
micro avg 0.36 0.36 0.36 2069
macro avg 0.35 0.36 0.35 2069
weighted avg 0.37 0.36 0.36 2069
counter:- 7 ACC: 0.35621072982116964
[[145 156 142]
[269 278 271]
[270 276 262]]
precision recall f1-score support
0 0.21 0.33 0.26 443
1 0.39 0.34 0.36 818
2 0.39 0.32 0.35 808
micro avg 0.33 0.33 0.33 2069
macro avg 0.33 0.33 0.32 2069
weighted avg 0.35 0.33 0.34 2069
counter:- 8 ACC: 0.33107781536974384
[[145 149 149]
[254 295 269]
[275 256 277]]
precision recall f1-score support
0 0.22 0.33 0.26 443
1 0.42 0.36 0.39 818
2 0.40 0.34 0.37 808
micro avg 0.35 0.35 0.35 2069
macro avg 0.35 0.34 0.34 2069
weighted avg 0.37 0.35 0.35 2069
counter:- 9 ACC: 0.34654422426292897
[[137 140 166]
[285 267 266]
[254 269 285]]
precision recall f1-score support
0 0.20 0.31 0.24 443
1 0.39 0.33 0.36 818
2 0.40 0.35 0.37 808
micro avg 0.33 0.33 0.33 2069
macro avg 0.33 0.33 0.33 2069
weighted avg 0.35 0.33 0.34 2069
counter:- 10 ACC: 0.333011116481392
Best Result:= 0.3489661330153808
{'best_acc:': 0.3489661330153808}
>>>
======== RESTART: D:\workspace\tipr\tipr-first-assignment\src\main.py ========
Welcome to the world of high and low dimensions!
>>> init()
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
Loading Already Stored File
>>> sk_fold_crossValidate(X,Y,skKNN,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
[[138 162 144]
[235 310 274]
[282 251 276]]
precision recall f1-score support
0 0.21 0.31 0.25 444
1 0.43 0.38 0.40 819
2 0.40 0.34 0.37 809
micro avg 0.35 0.35 0.35 2072
macro avg 0.35 0.34 0.34 2072
weighted avg 0.37 0.35 0.36 2072
counter:- 1 ACC: 0.34942084942084944
[[139 152 153]
[260 302 256]
[244 269 296]]
precision recall f1-score support
0 0.22 0.31 0.26 444
1 0.42 0.37 0.39 818
2 0.42 0.37 0.39 809
micro avg 0.36 0.36 0.36 2071
macro avg 0.35 0.35 0.35 2071
weighted avg 0.38 0.36 0.36 2071
counter:- 2 ACC: 0.355866731047803
[[136 166 142]
[262 300 256]
[256 279 274]]
precision recall f1-score support
0 0.21 0.31 0.25 444
1 0.40 0.37 0.38 818
2 0.41 0.34 0.37 809
micro avg 0.34 0.34 0.34 2071
macro avg 0.34 0.34 0.33 2071
weighted avg 0.36 0.34 0.35 2071
counter:- 3 ACC: 0.34282955094157413
[[148 146 150]
[225 323 270]
[245 268 296]]
precision recall f1-score support
0 0.24 0.33 0.28 444
1 0.44 0.39 0.42 818
2 0.41 0.37 0.39 809
micro avg 0.37 0.37 0.37 2071
macro avg 0.36 0.36 0.36 2071
weighted avg 0.39 0.37 0.38 2071
counter:- 4 ACC: 0.37035248672139065
[[132 150 161]
[241 298 279]
[236 267 306]]
precision recall f1-score support
0 0.22 0.30 0.25 443
1 0.42 0.36 0.39 818
2 0.41 0.38 0.39 809
micro avg 0.36 0.36 0.36 2070
macro avg 0.35 0.35 0.34 2070
weighted avg 0.37 0.36 0.36 2070
counter:- 5 ACC: 0.35555555555555557
[[138 156 149]
[242 296 280]
[255 266 288]]
precision recall f1-score support
0 0.22 0.31 0.26 443
1 0.41 0.36 0.39 818
2 0.40 0.36 0.38 809
micro avg 0.35 0.35 0.35 2070
macro avg 0.34 0.34 0.34 2070
weighted avg 0.37 0.35 0.35 2070
counter:- 6 ACC: 0.34879227053140094
[[155 154 134]
[242 306 270]
[255 277 276]]
precision recall f1-score support
0 0.24 0.35 0.28 443
1 0.42 0.37 0.39 818
2 0.41 0.34 0.37 808
micro avg 0.36 0.36 0.36 2069
macro avg 0.35 0.36 0.35 2069
weighted avg 0.37 0.36 0.36 2069
counter:- 7 ACC: 0.35621072982116964
[[145 156 142]
[269 278 271]
[270 276 262]]
precision recall f1-score support
0 0.21 0.33 0.26 443
1 0.39 0.34 0.36 818
2 0.39 0.32 0.35 808
micro avg 0.33 0.33 0.33 2069
macro avg 0.33 0.33 0.32 2069
weighted avg 0.35 0.33 0.34 2069
counter:- 8 ACC: 0.33107781536974384
[[145 149 149]
[254 295 269]
[275 256 277]]
precision recall f1-score support
0 0.22 0.33 0.26 443
1 0.42 0.36 0.39 818
2 0.40 0.34 0.37 808
micro avg 0.35 0.35 0.35 2069
macro avg 0.35 0.34 0.34 2069
weighted avg 0.37 0.35 0.35 2069
counter:- 9 ACC: 0.34654422426292897
[[137 140 166]
[285 267 266]
[254 269 285]]
precision recall f1-score support
0 0.20 0.31 0.24 443
1 0.39 0.33 0.36 818
2 0.40 0.35 0.37 808
micro avg 0.33 0.33 0.33 2069
macro avg 0.33 0.33 0.33 2069
weighted avg 0.35 0.33 0.34 2069
counter:- 10 ACC: 0.333011116481392
Best Result:= 0.3489661330153808
{'best_acc:': 0.3489661330153808}
>>> sk_fold_crossValidate(X,Y,naiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
no. of classes: 3 Data Shape: (18629, 128)
counter:- 1 ACC: 0.4025096525096525
no. of classes: 3 Data Shape: (18630, 128)
counter:- 2 ACC: 0.40994688556253017
no. of classes: 3 Data Shape: (18630, 128)
counter:- 3 ACC: 0.4398841139546113
no. of classes: 3 Data Shape: (18630, 128)
counter:- 4 ACC: 0.4505070014485756
no. of classes: 3 Data Shape: (18631, 128)
counter:- 5 ACC: 0.4444444444444444
no. of classes: 3 Data Shape: (18631, 128)
counter:- 6 ACC: 0.45169082125603865
no. of classes: 3 Data Shape: (18632, 128)
counter:- 7 ACC: 0.4485258579023683
no. of classes: 3 Data Shape: (18632, 128)
counter:- 8 ACC: 0.4224262928951184
no. of classes: 3 Data Shape: (18632, 128)
counter:- 9 ACC: 0.41324311261478974
no. of classes: 3 Data Shape: (18632, 128)
counter:- 10 ACC: 0.403093281778637
Best Result:= 0.42862714643667654
{'best_acc:': 0.42862714643667654}
>>> sk_fold_crossValidate(X,Y,sknaiveClass,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
Gaussian Naive Bayes model accuracy(in %): 39.285714285714285
[[ 94 168 182]
[155 351 313]
[159 281 369]]
precision recall f1-score support
0 0.23 0.21 0.22 444
1 0.44 0.43 0.43 819
2 0.43 0.46 0.44 809
micro avg 0.39 0.39 0.39 2072
macro avg 0.37 0.37 0.37 2072
weighted avg 0.39 0.39 0.39 2072
counter:- 1 ACC: 0.39285714285714285
Gaussian Naive Bayes model accuracy(in %): 39.2563978754225
[[107 164 173]
[175 314 329]
[149 268 392]]
precision recall f1-score support
0 0.25 0.24 0.24 444
1 0.42 0.38 0.40 818
2 0.44 0.48 0.46 809
micro avg 0.39 0.39 0.39 2071
macro avg 0.37 0.37 0.37 2071
weighted avg 0.39 0.39 0.39 2071
counter:- 2 ACC: 0.392563978754225
Gaussian Naive Bayes model accuracy(in %): 43.74698213423467
[[109 152 183]
[136 380 302]
[131 261 417]]
precision recall f1-score support
0 0.29 0.25 0.27 444
1 0.48 0.46 0.47 818
2 0.46 0.52 0.49 809
micro avg 0.44 0.44 0.44 2071
macro avg 0.41 0.41 0.41 2071
weighted avg 0.43 0.44 0.43 2071
counter:- 3 ACC: 0.4374698213423467
Gaussian Naive Bayes model accuracy(in %): 45.48527281506519
[[116 169 159]
[134 400 284]
[127 256 426]]
precision recall f1-score support
0 0.31 0.26 0.28 444
1 0.48 0.49 0.49 818
2 0.49 0.53 0.51 809
micro avg 0.45 0.45 0.45 2071
macro avg 0.43 0.43 0.43 2071
weighted avg 0.45 0.45 0.45 2071
counter:- 4 ACC: 0.45485272815065186
Gaussian Naive Bayes model accuracy(in %): 45.507246376811594
[[139 151 153]
[133 369 316]
[103 272 434]]
precision recall f1-score support
0 0.37 0.31 0.34 443
1 0.47 0.45 0.46 818
2 0.48 0.54 0.51 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.43 0.44 2070
weighted avg 0.45 0.46 0.45 2070
counter:- 5 ACC: 0.45507246376811594
Gaussian Naive Bayes model accuracy(in %): 45.94202898550724
[[132 144 167]
[155 380 283]
[134 236 439]]
precision recall f1-score support
0 0.31 0.30 0.31 443
1 0.50 0.46 0.48 818
2 0.49 0.54 0.52 809
micro avg 0.46 0.46 0.46 2070
macro avg 0.44 0.44 0.43 2070
weighted avg 0.46 0.46 0.46 2070
counter:- 6 ACC: 0.45942028985507244
Gaussian Naive Bayes model accuracy(in %): 43.64427259545674
[[118 157 168]
[153 366 299]
[147 242 419]]
precision recall f1-score support
0 0.28 0.27 0.27 443
1 0.48 0.45 0.46 818
2 0.47 0.52 0.49 808
micro avg 0.44 0.44 0.44 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.44 0.43 2069
counter:- 7 ACC: 0.4364427259545674
Gaussian Naive Bayes model accuracy(in %): 42.67762203963267
[[126 136 181]
[141 351 326]
[146 256 406]]
precision recall f1-score support
0 0.31 0.28 0.29 443
1 0.47 0.43 0.45 818
2 0.44 0.50 0.47 808
micro avg 0.43 0.43 0.43 2069
macro avg 0.41 0.41 0.41 2069
weighted avg 0.43 0.43 0.43 2069
counter:- 8 ACC: 0.42677622039632673
Gaussian Naive Bayes model accuracy(in %): 40.50265828902852
[[104 144 195]
[152 332 334]
[141 265 402]]
precision recall f1-score support
0 0.26 0.23 0.25 443
1 0.45 0.41 0.43 818
2 0.43 0.50 0.46 808
micro avg 0.41 0.41 0.41 2069
macro avg 0.38 0.38 0.38 2069
weighted avg 0.40 0.41 0.40 2069
counter:- 9 ACC: 0.40502658289028515
Gaussian Naive Bayes model accuracy(in %): 37.554374093765105
[[101 163 179]
[173 312 333]
[179 265 364]]
precision recall f1-score support
0 0.22 0.23 0.23 443
1 0.42 0.38 0.40 818
2 0.42 0.45 0.43 808
micro avg 0.38 0.38 0.38 2069
macro avg 0.35 0.35 0.35 2069
weighted avg 0.38 0.38 0.38 2069
counter:- 10 ACC: 0.375543740937651
Best Result:= 0.42360256949063857
{'best_acc:': 0.42360256949063857}
>>> sk_fold_crossValidate(X,Y,skKNN,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
Traceback (most recent call last):
File "<pyshell#19>", line 1, in <module>
sk_fold_crossValidate(X,Y,skKNN,10)
File "D:\workspace\tipr\tipr-first-assignment\src\main.py", line 46, in sk_fold_crossValidate
tmp=model(X_train, X_test, Y_train, Y_test)
File "D:\workspace\tipr\tipr-first-assignment\src\main.py", line 116, in skKNN
y_pred = classifier.predict(X_test)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\neighbors\classification.py", line 149, in predict
neigh_dist, neigh_ind = self.kneighbors(X)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\neighbors\base.py", line 455, in kneighbors
for s in gen_even_slices(X.shape[0], n_jobs)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\parallel.py", line 917, in __call__
if self.dispatch_one_batch(iterator):
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\parallel.py", line 759, in dispatch_one_batch
self._dispatch(tasks)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\parallel.py", line 716, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\_parallel_backends.py", line 182, in apply_async
result = ImmediateResult(func)
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\_parallel_backends.py", line 549, in __init__
self.results = batch()
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\parallel.py", line 225, in __call__
for func, args, kwargs in self.items]
File "D:\Program Files\python\python36\lib\site-packages\sklearn\externals\joblib\parallel.py", line 225, in <listcomp>
for func, args, kwargs in self.items]
File "D:\Program Files\python\python36\lib\site-packages\sklearn\neighbors\base.py", line 292, in _tree_query_parallel_helper
return tree.query(data, n_neighbors, return_distance)
KeyboardInterrupt
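The run above was interrupted by hand while `KNeighborsClassifier.predict` was still searching neighbors over the ~18.6k-point training set. One possible mitigation (a sketch, not a verified fix for this machine) is to parallelize the query with `n_jobs` and pin the search structure with `algorithm`, both real `KNeighborsClassifier` parameters; the data below is synthetic.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X_train = rng.standard_normal((2000, 16))
Y_train = rng.integers(0, 3, size=2000)
X_test = rng.standard_normal((200, 16))

clf = KNeighborsClassifier(n_neighbors=5,
                           algorithm='kd_tree',  # avoid brute-force search
                           n_jobs=-1)            # use all cores for queries
clf.fit(X_train, Y_train)
y_pred = clf.predict(X_test)
```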
>>> sk_fold_crossValidate(X,Y,KNN,10)
StratifiedKFold(n_splits=10, random_state=None, shuffle=False)
              precision    recall  f1-score   support

           0       0.25      0.01      0.03       444
           1       0.42      0.23      0.30       819
           2       0.40      0.79      0.53       809

   micro avg       0.40      0.40      0.40      2072
   macro avg       0.36      0.35      0.29      2072
weighted avg       0.38      0.40      0.33      2072
counter:- 1 ACC: 0.402992277992278
              precision    recall  f1-score   support

           0       0.19      0.01      0.03       444
           1       0.39      0.22      0.28       818
           2       0.39      0.77      0.52       809

   micro avg       0.39      0.39      0.39      2071
   macro avg       0.32      0.33      0.28      2071
weighted avg       0.35      0.39      0.32      2071
counter:- 2 ACC: 0.3901496861419604
              precision    recall  f1-score   support

           0       0.39      0.02      0.03       444
           1       0.42      0.26      0.32       818
           2       0.40      0.76      0.53       809

   micro avg       0.41      0.41      0.41      2071
   macro avg       0.40      0.35      0.29      2071
weighted avg       0.41      0.41      0.34      2071
counter:- 3 ACC: 0.4056011588604539
              precision    recall  f1-score   support

           0       0.18      0.01      0.02       444
           1       0.41      0.25      0.31       818
           2       0.39      0.76      0.52       809

   micro avg       0.40      0.40      0.40      2071
   macro avg       0.33      0.34      0.28      2071
weighted avg       0.36      0.40      0.33      2071
counter:- 4 ACC: 0.39546112988894255
              precision    recall  f1-score   support

           0       0.16      0.01      0.02       443
           1       0.42      0.23      0.30       818
           2       0.41      0.81      0.54       809

   micro avg       0.41      0.41      0.41      2070
   macro avg       0.33      0.35      0.29      2070
weighted avg       0.36      0.41      0.33      2070
counter:- 5 ACC: 0.40869565217391307
              precision    recall  f1-score   support

           0       0.14      0.01      0.01       443
           1       0.41      0.22      0.29       818
           2       0.40      0.79      0.53       809

   micro avg       0.40      0.40      0.40      2070
   macro avg       0.31      0.34      0.28      2070
weighted avg       0.35      0.40      0.32      2070
counter:- 6 ACC: 0.39806763285024155
              precision    recall  f1-score   support

           0       0.16      0.01      0.02       443
           1       0.41      0.24      0.30       818
           2       0.39      0.76      0.52       808

   micro avg       0.39      0.39      0.39      2069
   macro avg       0.32      0.34      0.28      2069
weighted avg       0.35      0.39      0.33      2069
counter:- 7 ACC: 0.39342677622039635
              precision    recall  f1-score   support

           0       0.35      0.01      0.03       443
           1       0.39      0.21      0.27       818
           2       0.39      0.77      0.52       808

   micro avg       0.39      0.39      0.39      2069
   macro avg       0.38      0.33      0.27      2069
weighted avg       0.38      0.39      0.32      2069
counter:- 8 ACC: 0.3895601739971
              precision    recall  f1-score   support

           0       0.09      0.00      0.01       443
           1       0.41      0.23      0.30       818
           2       0.40      0.78      0.53       808

   micro avg       0.40      0.40      0.40      2069
   macro avg       0.30      0.34      0.28      2069
weighted avg       0.33      0.40      0.32      2069
counter:- 9 ACC: 0.3958434026099565
              precision    recall  f1-score   support

           0       0.07      0.00      0.01       443
           1       0.38      0.22      0.28       818
           2       0.39      0.77      0.52       808

   micro avg       0.39      0.39      0.39      2069
   macro avg       0.28      0.33      0.27      2069
weighted avg       0.32      0.39      0.31      2069
counter:- 10 ACC: 0.38569357177380376
Best Result:= 0.39654914625090465
{'best_acc:': 0.39654914625090465}
>>>
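The session above runs a 10-fold stratified cross-validation and prints a classification report plus a per-fold accuracy. A minimal self-contained sketch of such a loop with scikit-learn follows; the function body and the iris stand-in data are illustrative, not the assignment's actual `sk_fold_crossValidate`, and the final "best_acc" appears to be the mean of the fold accuracies.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import StratifiedKFold
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, Y = load_iris(return_X_y=True)          # toy stand-in for the assignment's data
skf = StratifiedKFold(n_splits=10, random_state=None, shuffle=False)

accs = []
for counter, (train_idx, test_idx) in enumerate(skf.split(X, Y), start=1):
    clf = KNeighborsClassifier().fit(X[train_idx], Y[train_idx])
    acc = accuracy_score(Y[test_idx], clf.predict(X[test_idx]))
    accs.append(acc)
    print('counter:-', counter, 'ACC:', acc)

# The log above reports the mean fold accuracy as "Best Result".
print({'best_acc:': float(np.mean(accs))})
```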
85f264a652c395a8bb1ca2b6feb55ec663962bb7 | 26,652 | py | Python | mmcls/datasets/cifar.py | TanZheling/mmclassification | 1c3ff80f4f8a0b57a57eb08f2325a0c4befb7201 | [
"Apache-2.0"
] | null | null | null | mmcls/datasets/cifar.py | TanZheling/mmclassification | 1c3ff80f4f8a0b57a57eb08f2325a0c4befb7201 | [
"Apache-2.0"
] | null | null | null | mmcls/datasets/cifar.py | TanZheling/mmclassification | 1c3ff80f4f8a0b57a57eb08f2325a0c4befb7201 | [
"Apache-2.0"
] | null | null | null
import os
import os.path
import pickle
import copy
import numpy as np
import torch.distributed as dist
import mmcv
from mmcv.runner import get_dist_info
from .base_dataset import BaseDataset
from .builder import DATASETS
from .utils import check_integrity, download_and_extract_archive
def x_u_split(num_labeled, labels, replace=False):
class_num = len(set(labels))
label_per_class, surplus = divmod(num_labeled, class_num)
labels = np.array(labels)
labeled_idx = []
for i in range(class_num):
idx = np.where(labels == i)[0]
surplus_i = 1 if surplus > 0 else 0
surplus -= surplus_i
idx = np.random.choice(idx, label_per_class + surplus_i, replace)
labeled_idx.extend(idx)
unlabeled_idx = np.array(list(set(range(len(labels))) - set(labeled_idx)))
labeled_idx = np.array(labeled_idx)
assert len(labeled_idx) == num_labeled
return labeled_idx, unlabeled_idx
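`x_u_split` above gives each class `num_labeled // class_num` samples and hands the `divmod` remainder to the first few classes, which is what the final `assert len(labeled_idx) == num_labeled` relies on. A self-contained sketch of just that allocation arithmetic, using toy numbers rather than CIFAR's:

```python
num_labeled, class_num = 7, 3
label_per_class, surplus = divmod(num_labeled, class_num)   # 2 per class, surplus 1

per_class = []
for i in range(class_num):
    surplus_i = 1 if surplus > 0 else 0   # the first `surplus` classes get one extra
    surplus -= surplus_i
    per_class.append(label_per_class + surplus_i)

print(per_class)       # [3, 2, 2]
print(sum(per_class))  # 7, matching num_labeled as the assert requires
```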
@DATASETS.register_module()
class CIFAR10(BaseDataset):
"""`CIFAR10 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ Dataset.
This implementation is modified from
https://github.com/pytorch/vision/blob/master/torchvision/datasets/cifar.py # noqa: E501
"""
base_folder = 'cifar-10-batches-py'
url = 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz'
filename = 'cifar-10-python.tar.gz'
tgz_md5 = 'c58f30108f718f92721af3b95e74349a'
train_list = [
['data_batch_1', 'c99cafc152244af753f735de768cd75f'],
['data_batch_2', 'd4bba439e000b95fd0a9bffe97cbabec'],
['data_batch_3', '54ebc095f3ab1f0389bbae665268c751'],
['data_batch_4', '634d18415352ddfa80567beed471001a'],
['data_batch_5', '482c414d41f54cd18b22e5b47cb7c3cb'],
]
test_list = [
['test_batch', '40351d587109b95175f43aff81a1287e'],
]
meta = {
'filename': 'batches.meta',
'key': 'label_names',
'md5': '5ff9c542aee3614f3951f8cda6e48888',
}
def __init__(self, mode=None, labeled_ratio=0.5, **kwargs):
assert mode in ['labeled', 'unlabeled', None]
self.mode = mode
self.labeled_ratio = labeled_ratio
super().__init__(**kwargs)
def load_annotations(self):
rank, world_size = get_dist_info()
if rank == 0 and not self._check_integrity():
download_and_extract_archive(
self.url,
self.data_prefix,
filename=self.filename,
md5=self.tgz_md5)
if world_size > 1:
dist.barrier()
assert self._check_integrity(), \
'Shared storage seems unavailable. ' \
f'Please download the dataset manually through {self.url}.'
if not self.test_mode:
downloaded_list = self.train_list
else:
downloaded_list = self.test_list
self.imgs = []
self.gt_labels = []
        # load the pickled numpy arrays
for file_name, checksum in downloaded_list:
file_path = os.path.join(self.data_prefix, self.base_folder,
file_name)
with open(file_path, 'rb') as f:
entry = pickle.load(f, encoding='latin1')
self.imgs.append(entry['data'])
if 'labels' in entry:
self.gt_labels.extend(entry['labels'])
else:
self.gt_labels.extend(entry['fine_labels'])
self.imgs = np.vstack(self.imgs).reshape(-1, 3, 32, 32)
self.imgs = self.imgs.transpose((0, 2, 3, 1)) # convert to HWC
self._load_meta()
if self.mode is not None:
labeled_idxs, unlabeled_idxs = x_u_split(num_labeled=int(self.labeled_ratio * len(self.imgs)), labels=self.gt_labels)
self.imgs = self.imgs[labeled_idxs] if self.mode == 'labeled' else self.imgs[unlabeled_idxs]
gt_labels = np.array(self.gt_labels)
gt_labels = gt_labels[labeled_idxs] if self.mode == 'labeled' else gt_labels[unlabeled_idxs]
self.gt_labels = gt_labels.tolist()
        print('CIFAR{}:'.format(self.base_folder.split('-')[1]), self.imgs.shape, len(self.gt_labels))
data_infos = []
for img, gt_label in zip(self.imgs, self.gt_labels):
gt_label = np.array(gt_label, dtype=np.int64)
info = {'img': img, 'gt_label': gt_label, 'ori_img': copy.deepcopy(img)}
data_infos.append(info)
return data_infos
def _load_meta(self):
path = os.path.join(self.data_prefix, self.base_folder,
self.meta['filename'])
if not check_integrity(path, self.meta['md5']):
raise RuntimeError(
'Dataset metadata file not found or corrupted.' +
' You can use download=True to download it')
with open(path, 'rb') as infile:
data = pickle.load(infile, encoding='latin1')
self.CLASSES = data[self.meta['key']]
def _check_integrity(self):
root = self.data_prefix
for fentry in (self.train_list + self.test_list):
filename, md5 = fentry[0], fentry[1]
fpath = os.path.join(root, self.base_folder, filename)
if not check_integrity(fpath, md5):
return False
return True
@DATASETS.register_module()
class CIFAR100(CIFAR10):
"""`CIFAR100 <https://www.cs.toronto.edu/~kriz/cifar.html>`_ Dataset."""
base_folder = 'cifar-100-python'
url = 'https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz'
filename = 'cifar-100-python.tar.gz'
tgz_md5 = 'eb9058c3a382ffc7106e4002c42a8d85'
train_list = [
['train', '16019d7e3df5f24257cddd939b257f8d'],
]
test_list = [
['test', 'f0ef6b0ae62326f3e7ffdfab6717acfc'],
]
meta = {
'filename': 'meta',
'key': 'fine_label_names',
'md5': '7973b15100ade9c7d40fb424638fde48',
}
def get_npy(data_path, is_train, corruption, severity, shuffle=False):
# np.load('data/CIFAR-10-C/train/%s_%d_images.npy' % (corruption, level))
npy_list = []
load_list = []
for c in corruption:
for s in severity:
load_list.append((c, s))
load_list = np.array(load_list)
if shuffle:
order = np.random.permutation(len(corruption) * len(severity))
load_list = load_list[order]
print('Shuffling:', load_list)
for i in load_list:
c, s = i[0], int(i[1])
assert s in [1, 2, 3, 4, 5]
assert c in [
'gaussian_noise', 'shot_noise', 'impulse_noise',
'defocus_blur', 'glass_blur', 'motion_blur',
'zoom_blur', 'snow', 'frost', 'fog', 'brightness',
'contrast', 'elastic_transform', 'pixelate', 'jpeg_compression'
]
if is_train:
len_npy = 50000 # num of training images
npy_all = np.load(data_path + '/train/%s.npy' % c)
else:
len_npy = 10000 # num of testing images
npy_all = np.load(data_path + '/val/%s.npy' % c)
npy_list.append(npy_all[(s - 1) * len_npy: s * len_npy])
if len(npy_list) > 1:
return np.concatenate(tuple(npy_list), axis=0)
else:
return npy_list[0]
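`get_npy` above assumes each corruption `.npy` file stacks the five severity levels along axis 0, so severity `s` is recovered with the slice `[(s - 1) * len_npy : s * len_npy]`. A self-contained sketch of that slicing with a tiny fake array (3 images per severity instead of the real 10000):

```python
import numpy as np

# Fake CIFAR-10-C-style file: 5 severities x 3 images, each 32x32x3;
# image i carries the constant value i so slices are easy to check.
len_npy = 3
npy_all = np.arange(5 * len_npy)[:, None, None, None] * np.ones((1, 32, 32, 3))

blocks = []
for s in (1, 3):                                    # pick severities 1 and 3
    blocks.append(npy_all[(s - 1) * len_npy: s * len_npy])
out = np.concatenate(blocks, axis=0)

print(out.shape)                            # (6, 32, 32, 3)
print(out[0, 0, 0, 0], out[3, 0, 0, 0])     # 0.0 (severity-1 start), 6.0 (severity-3 start)
```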
def get_npy_cbar(data_path, is_train, corruption, severity, shuffle=False):
npy_list = []
load_list = []
for c in corruption:
for s in severity:
load_list.append((c, s))
load_list = np.array(load_list)
if shuffle:
order = np.random.permutation(len(corruption) * len(severity))
load_list = load_list[order]
print('Shuffling:', load_list)
for i in load_list:
c, s = i[0], int(i[1])
assert s in [1, 2, 3, 4, 5]
assert c in [
"blue_noise_sample", "checkerboard_cutout",
"inverse_sparkles", "pinch_and_twirl",
"ripple", "brownish_noise",
"circular_motion_blur", "lines",
"sparkles", "transverse_chromatic_abberation"
]
if is_train:
len_npy = 50000 # num of training images
npy_all = np.load(data_path + '/%s.npy' % c)
else:
len_npy = 10000 # num of testing images
npy_all = np.load(data_path + '/%s.npy' % c)
npy_list.append(npy_all[(s - 1) * len_npy: s * len_npy])
if len(npy_list) > 1:
return np.concatenate(tuple(npy_list), axis=0)
else:
return npy_list[0]
def get_npy_25(data_path, is_train, corruption, severity, shuffle=False):
npy_list = []
load_list = []
for c in corruption:
for s in severity:
load_list.append((c, s))
load_list = np.array(load_list)
if shuffle:
order = np.random.permutation(len(corruption) * len(severity))
load_list = load_list[order]
print('Shuffling:', load_list)
for i in load_list:
c, s = i[0], int(i[1])
assert s in [1, 2, 3, 4, 5]
assert c in [
'gaussian_noise', 'shot_noise', 'impulse_noise',
'defocus_blur', 'glass_blur', 'motion_blur',
'zoom_blur', 'snow', 'frost', 'fog', 'brightness',
'contrast', 'elastic_transform', 'pixelate', 'jpeg_compression',
"blue_noise_sample", "checkerboard_cutout",
"inverse_sparkles", "pinch_and_twirl",
"ripple", "brownish_noise",
"circular_motion_blur", "lines",
"sparkles", "transverse_chromatic_abberation"
]
if is_train:
len_npy = 50000 # num of training images
npy_all = np.load(data_path + '/%s.npy' % c)
else:
len_npy = 10000 # num of testing images
npy_all = np.load(data_path + '/%s.npy' % c)
npy_list.append(npy_all[(s - 1) * len_npy: s * len_npy])
if len(npy_list) > 1:
return np.concatenate(tuple(npy_list), axis=0)
else:
return npy_list[0]
@DATASETS.register_module()
class CIFAR10C(CIFAR10):
"""`CIFAR10-C <https://github.com/hendrycks/robustness>`_ Dataset.
"""
npy_folder = 'cifar10c'
def __init__(self, corruption, severity, shuffle=False, **kwargs):
if isinstance(corruption, str):
corruption = [corruption]
if isinstance(severity, int):
severity = [severity]
self.corruption, self.severity = corruption, severity
self.shuffle = shuffle
super().__init__(**kwargs)
def load_annotations(self):
if not self._check_integrity():
download_and_extract_archive(
self.url,
self.data_prefix,
filename=self.filename,
md5=self.tgz_md5)
if not self.test_mode:
downloaded_list = self.train_list
else:
downloaded_list = self.test_list
# self.imgs = []
# self.gt_labels = []
gt_labels = []
        # load the pickled numpy arrays
for file_name, checksum in downloaded_list:
file_path = os.path.join(self.data_prefix, self.base_folder,
file_name)
with open(file_path, 'rb') as f:
entry = pickle.load(f, encoding='latin1')
# self.imgs.append(entry['data'])
if 'labels' in entry:
gt_labels.extend(entry['labels'])
else:
gt_labels.extend(entry['fine_labels'])
# self.imgs = np.vstack(self.imgs).reshape(-1, 3, 32, 32)
# self.imgs = self.imgs.transpose((0, 2, 3, 1)) # convert to HWC
self.imgs = get_npy(
os.path.join(self.data_prefix, self.npy_folder),
not self.test_mode,
self.corruption,
self.severity,
self.shuffle
)
# CORRUPTED DATA is already N x 32 x 32 x 3
len_c, len_s, len_l = len(self.corruption), len(self.severity), len(gt_labels)
self.gt_labels = np.array(gt_labels * len_c * len_s)
print('CIFAR10C', '(test)' if self.test_mode else '(train)', self.corruption, self.severity, self.imgs.shape, len(self.gt_labels))
self._load_meta()
# shuffle data of each single npy
if self.shuffle:
order = []
for c in range(len_c):
for s in range(len_s):
start = (c * len_s + s) * len_l
print('Shuffling: {} to {}'.format(start, start + len_l))
order.append(np.random.permutation([i for i in range(start, start + len_l)]))
order = np.concatenate(order, axis=0)
self.imgs = self.imgs[order]
self.gt_labels = self.gt_labels[order]
data_infos = []
for img, gt_label in zip(self.imgs, self.gt_labels):
gt_label = np.array(gt_label, dtype=np.int64)
# info = {'img': img, 'gt_label': gt_label}
info = {
'img': img,
'gt_label': gt_label,
'ori_img': copy.deepcopy(img)
}
data_infos.append(info)
return data_infos
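Because `get_npy` concatenates one image block per (corruption, severity) pair, `load_annotations` replicates the clean label list with plain Python list multiplication (`gt_labels * len_c * len_s`) so that labels line up block-for-block with the stacked images. A toy illustration with made-up values:

```python
import numpy as np

gt_labels = [0, 1, 2]      # toy clean labels (stand-in for the 10000 CIFAR test labels)
len_c, len_s = 2, 2        # e.g. two corruptions at two severity levels
rep = np.array(gt_labels * len_c * len_s)   # list repetition, as in load_annotations above

print(rep.shape)           # (12,): one copy of the clean labels per loaded npy block
print(rep.tolist())        # [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2]
```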
@DATASETS.register_module()
class CIFAR10CBAR(CIFAR10):
"""`CIFAR10-C-BAR <https://github.com/facebookresearch/augmentation-corruption>`_ Dataset.
"""
npy_folder = 'cifar10cbar'
def __init__(self, corruption, severity, shuffle=False, **kwargs):
if isinstance(corruption, str):
corruption = [corruption]
if isinstance(severity, int):
severity = [severity]
self.corruption, self.severity = corruption, severity
self.shuffle = shuffle
super().__init__(**kwargs)
def load_annotations(self):
if not self._check_integrity():
download_and_extract_archive(
self.url,
self.data_prefix,
filename=self.filename,
md5=self.tgz_md5)
if not self.test_mode:
downloaded_list = self.train_list
else:
downloaded_list = self.test_list
# self.imgs = []
# self.gt_labels = []
gt_labels = []
        # load the pickled numpy arrays
for file_name, checksum in downloaded_list:
file_path = os.path.join(self.data_prefix, self.base_folder,
file_name)
with open(file_path, 'rb') as f:
entry = pickle.load(f, encoding='latin1')
# self.imgs.append(entry['data'])
if 'labels' in entry:
gt_labels.extend(entry['labels'])
else:
gt_labels.extend(entry['fine_labels'])
# self.imgs = np.vstack(self.imgs).reshape(-1, 3, 32, 32)
# self.imgs = self.imgs.transpose((0, 2, 3, 1)) # convert to HWC
self.imgs = get_npy_cbar(
os.path.join(self.data_prefix, self.npy_folder),
not self.test_mode,
self.corruption,
self.severity,
self.shuffle
)
# CORRUPTED DATA is already N x 32 x 32 x 3
len_c, len_s, len_l = len(self.corruption), len(self.severity), len(gt_labels)
self.gt_labels = np.array(gt_labels * len_c * len_s)
print('CIFAR10CBAR', '(test)' if self.test_mode else '(train)', self.corruption, self.severity, self.imgs.shape, len(self.gt_labels))
self._load_meta()
# shuffle data of each single npy
if self.shuffle:
order = []
for c in range(len_c):
for s in range(len_s):
start = (c * len_s + s) * len_l
print('Shuffling: {} to {}'.format(start, start + len_l))
order.append(np.random.permutation([i for i in range(start, start + len_l)]))
order = np.concatenate(order, axis=0)
self.imgs = self.imgs[order]
self.gt_labels = self.gt_labels[order]
data_infos = []
for img, gt_label in zip(self.imgs, self.gt_labels):
gt_label = np.array(gt_label, dtype=np.int64)
# info = {'img': img, 'gt_label': gt_label}
info = {
'img': img,
'gt_label': gt_label,
'ori_img': copy.deepcopy(img)
}
data_infos.append(info)
return data_infos
@DATASETS.register_module()
class CIFAR10C25(CIFAR10):
"""
`CIFAR10-C <https://github.com/hendrycks/robustness>`_ Dataset.
`CIFAR10-C-BAR <https://github.com/facebookresearch/augmentation-corruption>`_ Dataset.
"""
npy_folder = 'cifar10c25'
def __init__(self, corruption, severity, shuffle=False, **kwargs):
if isinstance(corruption, str):
corruption = [corruption]
if isinstance(severity, int):
severity = [severity]
self.corruption, self.severity = corruption, severity
self.shuffle = shuffle
super().__init__(**kwargs)
def load_annotations(self):
if not self._check_integrity():
download_and_extract_archive(
self.url,
self.data_prefix,
filename=self.filename,
md5=self.tgz_md5)
if not self.test_mode:
downloaded_list = self.train_list
else:
downloaded_list = self.test_list
# self.imgs = []
# self.gt_labels = []
gt_labels = []
        # load the pickled numpy arrays
for file_name, checksum in downloaded_list:
file_path = os.path.join(self.data_prefix, self.base_folder,
file_name)
with open(file_path, 'rb') as f:
entry = pickle.load(f, encoding='latin1')
# self.imgs.append(entry['data'])
if 'labels' in entry:
gt_labels.extend(entry['labels'])
else:
gt_labels.extend(entry['fine_labels'])
# self.imgs = np.vstack(self.imgs).reshape(-1, 3, 32, 32)
# self.imgs = self.imgs.transpose((0, 2, 3, 1)) # convert to HWC
self.imgs = get_npy_25(
os.path.join(self.data_prefix, self.npy_folder),
not self.test_mode,
self.corruption,
self.severity,
self.shuffle
)
# CORRUPTED DATA is already N x 32 x 32 x 3
len_c, len_s, len_l = len(self.corruption), len(self.severity), len(gt_labels)
self.gt_labels = np.array(gt_labels * len_c * len_s)
print('CIFAR10C25', '(test)' if self.test_mode else '(train)', self.corruption, self.severity, self.imgs.shape, len(self.gt_labels))
self._load_meta()
# shuffle data of each single npy
if self.shuffle:
order = []
for c in range(len_c):
for s in range(len_s):
start = (c * len_s + s) * len_l
print('Shuffling: {} to {}'.format(start, start + len_l))
order.append(np.random.permutation([i for i in range(start, start + len_l)]))
order = np.concatenate(order, axis=0)
self.imgs = self.imgs[order]
self.gt_labels = self.gt_labels[order]
data_infos = []
for img, gt_label in zip(self.imgs, self.gt_labels):
gt_label = np.array(gt_label, dtype=np.int64)
# info = {'img': img, 'gt_label': gt_label}
info = {
'img': img,
'gt_label': gt_label,
'ori_img': copy.deepcopy(img)
}
data_infos.append(info)
return data_infos
@DATASETS.register_module()
class CIFAR100C(CIFAR100):
"""`CIFAR100-C <https://github.com/hendrycks/robustness>`_ Dataset.
"""
base_folder = 'cifar-100-python'
url = 'https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz'
filename = 'cifar-100-python.tar.gz'
tgz_md5 = 'eb9058c3a382ffc7106e4002c42a8d85'
train_list = [
['train', '16019d7e3df5f24257cddd939b257f8d'],
]
test_list = [
['test', 'f0ef6b0ae62326f3e7ffdfab6717acfc'],
]
meta = {
'filename': 'meta',
'key': 'fine_label_names',
'md5': '7973b15100ade9c7d40fb424638fde48',
}
npy_folder = 'cifar100c'
def __init__(self, corruption, severity, shuffle=False, shuffle_deep=False, **kwargs):
if isinstance(corruption, str):
corruption = [corruption]
if isinstance(severity, int):
severity = [severity]
self.corruption, self.severity = corruption, severity
self.shuffle = shuffle
self.shuffle_deep = shuffle_deep
super().__init__(**kwargs)
def load_annotations(self):
if not self._check_integrity():
download_and_extract_archive(
self.url,
self.data_prefix,
filename=self.filename,
md5=self.tgz_md5)
if not self.test_mode:
downloaded_list = self.train_list
else:
downloaded_list = self.test_list
gt_labels = []
        # load the pickled numpy arrays
for file_name, checksum in downloaded_list:
file_path = os.path.join(self.data_prefix, self.base_folder,
file_name)
with open(file_path, 'rb') as f:
entry = pickle.load(f, encoding='latin1')
# self.imgs.append(entry['data'])
if 'labels' in entry:
gt_labels.extend(entry['labels'])
else:
gt_labels.extend(entry['fine_labels'])
# self.imgs = np.vstack(self.imgs).reshape(-1, 3, 32, 32)
# self.imgs = self.imgs.transpose((0, 2, 3, 1)) # convert to HWC
self.imgs = get_npy(
os.path.join(self.data_prefix, self.npy_folder),
not self.test_mode,
self.corruption,
self.severity,
self.shuffle
)
# CORRUPTED DATA is already N x 32 x 32 x 3
len_c, len_s, len_l = len(self.corruption), len(self.severity), len(gt_labels)
self.gt_labels = np.array(gt_labels * len_c * len_s)
print('CIFAR100C', '(test)' if self.test_mode else '(train)', self.corruption, self.severity, self.imgs.shape, len(self.gt_labels))
self._load_meta()
# shuffle data of each single npy
if self.shuffle:
order = []
for c in range(len_c):
for s in range(len_s):
start = (c * len_s + s) * len_l
print('Shuffling: {} to {}'.format(start, start + len_l))
order.append(np.random.permutation([i for i in range(start, start + len_l)]))
order = np.concatenate(order, axis=0)
self.imgs = self.imgs[order]
self.gt_labels = self.gt_labels[order]
elif self.shuffle_deep:
print('Shuffling: {} to {}'.format(0, len_c * len_s * len_l))
order = np.random.permutation([i for i in range(len_c * len_s * len_l)])
self.imgs = self.imgs[order]
self.gt_labels = self.gt_labels[order]
data_infos = []
for img, gt_label in zip(self.imgs, self.gt_labels):
gt_label = np.array(gt_label, dtype=np.int64)
# info = {'img': img, 'gt_label': gt_label}
info = {
'img': img,
'gt_label': gt_label,
'ori_img': copy.deepcopy(img)
}
data_infos.append(info)
return data_infos
@DATASETS.register_module()
class CIFAR102(BaseDataset):
"""`CIFAR10.2 <https://github.com/modestyachts/cifar-10.2>`_ Dataset.
This implementation is modified from
https://github.com/pytorch/vision/blob/master/torchvision/datasets/cifar.py # noqa: E501
"""
CLASSES = ['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
def __init__(self, soft_file=None, **kwargs):
self.soft_file = soft_file
super(CIFAR102, self).__init__(**kwargs)
def load_annotations(self):
# assert self.test_mode # test data only
if not self.test_mode:
npy_path = self.data_prefix + '/cifar102_train.npy'
else:
npy_path = self.data_prefix + '/cifar102_test.npy'
npy_data = np.load(npy_path, allow_pickle=True).item()
self.imgs = npy_data['images'] # Nx32x32x3
self.gt_labels = npy_data['labels']
if self.soft_file is not None:
soft_labels = mmcv.load(self.soft_file)
data_infos = []
for idx, (img, gt_label) in enumerate(zip(self.imgs, self.gt_labels)):
gt_label = np.array(gt_label, dtype=np.int64)
info = {'img': img, 'gt_label': gt_label}
if self.soft_file is not None:
info['gt_logit'] = np.array(soft_labels[idx])
data_infos.append(info)
return data_infos
@DATASETS.register_module()
class CIFAR101(BaseDataset):
"""`CIFAR10.1 <https://github.com/modestyachts/CIFAR-10.1>`_ Dataset.
This implementation is modified from
https://github.com/pytorch/vision/blob/master/torchvision/datasets/cifar.py # noqa: E501
"""
CLASSES = ['airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']
def __init__(self, soft_file=None, **kwargs):
self.soft_file = soft_file
super(CIFAR101, self).__init__(**kwargs)
def load_annotations(self):
# assert self.test_mode # test data only
self.imgs = np.load(self.data_prefix + '/cifar10.1_v6_data.npy') # 2000x32x32x3
self.gt_labels = np.load(self.data_prefix + '/cifar10.1_v6_labels.npy')
if self.soft_file is not None:
soft_labels = mmcv.load(self.soft_file)
data_infos = []
for idx, (img, gt_label) in enumerate(zip(self.imgs, self.gt_labels)):
gt_label = np.array(gt_label, dtype=np.int64)
info = {'img': img, 'gt_label': gt_label}
if self.soft_file is not None:
info['gt_logit'] = np.array(soft_labels[idx])
data_infos.append(info)
return data_infos
# axel https://github.com/modestyachts/cifar-10.2/raw/master/cifar102_train.npy;
# axel https://github.com/modestyachts/cifar-10.2/raw/master/cifar102_test.npy;
# axel https://github.com/modestyachts/CIFAR-10.1/raw/master/datasets/cifar10.1_v6_data.npy;
# axel https://github.com/modestyachts/CIFAR-10.1/raw/master/datasets/cifar10.1_v6_labels.npy
# or wget
c81439b4d095f75b28c19c281c6b4ea5c1948900 | 191 | py | Python | mechmat/properties/viscosity/__init__.py | mecheng/mechmat | 2c3bc43dce85d4827450a8ad69311ca49bb0d035 | [
"MIT"
] | 1 | 2019-05-13T09:19:13.000Z | 2019-05-13T09:19:13.000Z | mechmat/properties/viscosity/__init__.py | mecheng/mechmat | 2c3bc43dce85d4827450a8ad69311ca49bb0d035 | [
"MIT"
] | 25 | 2019-05-24T18:59:38.000Z | 2021-06-01T23:44:25.000Z | mechmat/properties/viscosity/__init__.py | jellespijker/mechmat | 2c3bc43dce85d4827450a8ad69311ca49bb0d035 | [
"MIT"
] | 1 | 2020-09-06T12:38:08.000Z | 2020-09-06T12:38:08.000Z
from mechmat.properties.viscosity.viscosity import Viscosity
from mechmat.properties.viscosity.crossarrhenius import CrossArrhenius
from mechmat.properties.viscosity.crosswlf import CrossWLF
c863c0afcef5c52f1e21403dc0cb6ff119b323d0 | 4,987 | py | Python | consult/views.py | Emmanuel-9/Consultancy | a13a63efcfd733a21e24334e1bf0850d764ac8d4 | [
"MIT"
] | null | null | null | consult/views.py | Emmanuel-9/Consultancy | a13a63efcfd733a21e24334e1bf0850d764ac8d4 | [
"MIT"
] | null | null | null | consult/views.py | Emmanuel-9/Consultancy | a13a63efcfd733a21e24334e1bf0850d764ac8d4 | [
"MIT"
] | null | null | null
from django.shortcuts import render, redirect
from django.http import HttpResponse
from .forms import SubscriptionForm,ContactForm
# Create your views here.
def _subscription_view(request, template_name):
    """Shared handler: bind the newsletter SubscriptionForm and render a template."""
    if request.method == 'POST':
        form = SubscriptionForm(request.POST)
        if form.is_valid():
            print('valid')
    else:
        form = SubscriptionForm()
    return render(request, template_name, {"form": form})


def home(request):
    return _subscription_view(request, 'base.html')


def services(request):
    return _subscription_view(request, 'services/services.html')


def about(request):
    return _subscription_view(request, 'about/about.html')


def case_studies(request):
    return _subscription_view(request, 'case_studies.html')


def questions(request):
    return _subscription_view(request, 'questions.html')


def blog(request):
    return _subscription_view(request, 'blog.html')


def approach(request):
    return _subscription_view(request, 'about/approach.html')


def career(request):
    return _subscription_view(request, 'about/careers.html')


def purpose(request):
    return _subscription_view(request, 'about/purpose.html')


def recognition(request):
    return _subscription_view(request, 'about/recognition.html')


def team(request):
    return _subscription_view(request, 'about/team.html')


def building(request):
    return _subscription_view(request, 'services/building.html')


def change(request):
    return _subscription_view(request, 'services/change.html')


def conflict(request):
    return _subscription_view(request, 'services/conflict.html')


def digital(request):
    return _subscription_view(request, 'services/digital.html')


def leadership(request):
    return _subscription_view(request, 'services/leadership.html')


def talent(request):
    return _subscription_view(request, 'services/talent.html')


def contact(request):
    return _subscription_view(request, 'contact.html')
c86ae0e2128b63122e2c092617507430a45b16bc | 8,767 | py | Python | catkin_ws/build/catkin_generated/generate_cached_setup.py | udeto/carla_autoware_final | 479e53f916be1cffa2524eb854bc3d1c47471bc1 | [
"MIT"
] | 2 | 2021-02-03T07:22:12.000Z | 2021-06-24T15:10:52.000Z | catkin_ws/build/catkin_generated/generate_cached_setup.py | udeto/carla_autoware_final | 479e53f916be1cffa2524eb854bc3d1c47471bc1 | [
"MIT"
] | null | null | null | catkin_ws/build/catkin_generated/generate_cached_setup.py | udeto/carla_autoware_final | 479e53f916be1cffa2524eb854bc3d1c47471bc1 | [
"MIT"
] | 2 | 2020-08-24T09:16:31.000Z | 2020-12-08T06:18:23.000Z
# -*- coding: utf-8 -*-
from __future__ import print_function
import argparse
import os
import stat
import sys
# find the import for catkin's python package - either from source space or from an installed underlay
if os.path.exists(os.path.join('/opt/ros/kinetic/share/catkin/cmake', 'catkinConfig.cmake.in')):
sys.path.insert(0, os.path.join('/opt/ros/kinetic/share/catkin/cmake', '..', 'python'))
try:
from catkin.environment_cache import generate_environment_script
except ImportError:
# search for catkin package in all workspaces and prepend to path
    for workspace in ("/home/lukas/carla/autoware.ai/install/ymc;/home/lukas/carla/autoware.ai/install/xsens_driver;/home/lukas/carla/autoware.ai/install/lattice_planner;/home/lukas/carla/autoware.ai/install/waypoint_planner;/home/lukas/carla/autoware.ai/install/waypoint_maker;/home/lukas/carla/autoware.ai/install/way_planner;/home/lukas/carla/autoware.ai/install/op_utilities;/home/lukas/carla/autoware.ai/install/op_simulation_package;/home/lukas/carla/autoware.ai/install/op_local_planner;/home/lukas/carla/autoware.ai/install/op_global_planner;/home/lukas/carla/autoware.ai/install/lidar_kf_contour_track;/home/lukas/carla/autoware.ai/install/op_ros_helpers;/home/lukas/carla/autoware.ai/install/lane_planner;/home/lukas/carla/autoware.ai/install/ff_waypoint_follower;/home/lukas/carla/autoware.ai/install/dp_planner;/home/lukas/carla/autoware.ai/install/waypoint_follower;/home/lukas/carla/autoware.ai/install/vlg22c_cam;/home/lukas/carla/autoware.ai/install/vision_ssd_detect;/home/lukas/carla/autoware.ai/install/vision_segment_enet_detect;/home/lukas/carla/autoware.ai/install/vision_lane_detect;/home/lukas/carla/autoware.ai/install/vision_darknet_detect;/home/lukas/carla/autoware.ai/install/vision_beyond_track;/home/lukas/carla/autoware.ai/install/vehicle_socket;/home/lukas/carla/autoware.ai/install/vehicle_model;/home/lukas/carla/autoware.ai/install/vehicle_gazebo_simulation_launcher;/home/lukas/carla/autoware.ai/install/vehicle_gazebo_simulation_interface;/home/lukas/carla/autoware.ai/install/vehicle_description;/home/lukas/carla/autoware.ai/install/trafficlight_recognizer;/home/lukas/carla/autoware.ai/install/op_simu;/home/lukas/carla/autoware.ai/install/op_planner;/home/lukas/carla/autoware.ai/install/op_utility;/home/lukas/carla/autoware.ai/install/lidar_euclidean_cluster_detect;/home/lukas/carla/autoware.ai/install/vector_map_server;/home/lukas/carla/autoware.ai/install/road_occupancy_processor;/home/lukas/carla/autoware.ai/install/costmap_generator;/home/lukas/carla/aut"
                      "oware.ai/install/object_map;/home/lukas/carla/autoware.ai/install/naive_motion_predict;/home/lukas/carla/autoware.ai/install/map_file;/home/lukas/carla/autoware.ai/install/libvectormap;/home/lukas/carla/autoware.ai/install/imm_ukf_pda_track;/home/lukas/carla/autoware.ai/install/decision_maker;/home/lukas/carla/autoware.ai/install/vector_map;/home/lukas/carla/autoware.ai/install/vector_map_msgs;/home/lukas/carla/autoware.ai/install/vectacam;/home/lukas/carla/autoware.ai/install/udon_socket;/home/lukas/carla/autoware.ai/install/twist_generator;/home/lukas/carla/autoware.ai/install/tablet_socket;/home/lukas/carla/autoware.ai/install/runtime_manager;/home/lukas/carla/autoware.ai/install/mqtt_socket;/home/lukas/carla/autoware.ai/install/tablet_socket_msgs;/home/lukas/carla/autoware.ai/install/state_machine_lib;/home/lukas/carla/autoware.ai/install/sound_player;/home/lukas/carla/autoware.ai/install/sick_lms5xx;/home/lukas/carla/autoware.ai/install/sick_ldmrs_tools;/home/lukas/carla/autoware.ai/install/sick_ldmrs_driver;/home/lukas/carla/autoware.ai/install/sick_ldmrs_msgs;/home/lukas/carla/autoware.ai/install/sick_ldmrs_description;/home/lukas/carla/autoware.ai/install/points2image;/home/lukas/carla/autoware.ai/install/rosinterface;/home/lukas/carla/autoware.ai/install/rosbag_controller;/home/lukas/carla/autoware.ai/install/roi_object_filter;/home/lukas/carla/autoware.ai/install/range_vision_fusion;/home/lukas/carla/autoware.ai/install/pos_db;/home/lukas/carla/autoware.ai/install/points_preprocessor;/home/lukas/carla/autoware.ai/install/points_downsampler;/home/lukas/carla/autoware.ai/install/pixel_cloud_fusion;/home/lukas/carla/autoware.ai/install/lidar_localizer;/home/lukas/carla/autoware.ai/install/pcl_omp_registration;/home/lukas/carla/autoware.ai/install/pc2_downsampler;/home/lukas/carla/autoware.ai/install/oculus_socket;/home/lukas/carla/autoware.ai/install/obj_db;/home/lukas/carla/autoware.ai/install/nmea_navsat;/home/lukas/carla/autoware.ai/install/ndt_tku;/home/l"
                      "ukas/carla/autoware.ai/install/ndt_cpu;/home/lukas/carla/autoware.ai/install/multi_lidar_calibrator;/home/lukas/carla/autoware.ai/install/microstrain_driver;/home/lukas/carla/autoware.ai/install/memsic_imu;/home/lukas/carla/autoware.ai/install/marker_downsampler;/home/lukas/carla/autoware.ai/install/map_tools;/home/lukas/carla/autoware.ai/install/map_tf_generator;/home/lukas/carla/autoware.ai/install/log_tools;/home/lukas/carla/autoware.ai/install/lidar_shape_estimation;/home/lukas/carla/autoware.ai/install/lidar_point_pillars;/home/lukas/carla/autoware.ai/install/lidar_naive_l_shape_detect;/home/lukas/carla/autoware.ai/install/lidar_fake_perception;/home/lukas/carla/autoware.ai/install/lidar_apollo_cnn_seg_detect;/home/lukas/carla/autoware.ai/install/lgsvl_simulator_bridge;/home/lukas/carla/autoware.ai/install/kvaser;/home/lukas/carla/autoware.ai/install/kitti_launch;/home/lukas/carla/autoware.ai/install/kitti_player;/home/lukas/carla/autoware.ai/install/kitti_box_publisher;/home/lukas/carla/autoware.ai/install/javad_navsat_driver;/home/lukas/carla/autoware.ai/install/integrated_viewer;/home/lukas/carla/autoware.ai/install/image_processor;/home/lukas/carla/autoware.ai/install/hokuyo;/home/lukas/carla/autoware.ai/install/graph_tools;/home/lukas/carla/autoware.ai/install/gnss_localizer;/home/lukas/carla/autoware.ai/install/gnss;/home/lukas/carla/autoware.ai/install/glviewer;/home/lukas/carla/autoware.ai/install/gazebo_world_description;/home/lukas/carla/autoware.ai/install/gazebo_imu_description;/home/lukas/carla/autoware.ai/install/gazebo_camera_description;/home/lukas/carla/autoware.ai/install/garmin;/home/lukas/carla/autoware.ai/install/freespace_planner;/home/lukas/carla/autoware.ai/install/fastvirtualscan;/home/lukas/carla/autoware.ai/install/ekf_localizer;/home/lukas/carla/autoware.ai/install/ds4_msgs;/home/lukas/carla/autoware.ai/install/ds4_driver;/home/lukas/carla/autoware.ai/install/detected_objects_visualizer;/home/lukas/carla/autoware.ai/install/decision_"
                      "maker_panel;/home/lukas/carla/autoware.ai/install/data_preprocessor;/home/lukas/carla/autoware.ai/install/custom_msgs;/home/lukas/carla/autoware.ai/install/calibration_publisher;/home/lukas/carla/autoware.ai/install/autoware_health_checker;/home/lukas/carla/autoware.ai/install/autoware_system_msgs;/home/lukas/carla/autoware.ai/install/autoware_rviz_plugins;/home/lukas/carla/autoware.ai/install/autoware_quickstart_examples;/home/lukas/carla/autoware.ai/install/autoware_pointgrey_drivers;/home/lukas/carla/autoware.ai/install/autoware_driveworks_interface;/home/lukas/carla/autoware.ai/install/autoware_connector;/home/lukas/carla/autoware.ai/install/autoware_camera_lidar_calibrator;/home/lukas/carla/autoware.ai/install/astar_search;/home/lukas/carla/autoware.ai/install/as;/home/lukas/carla/autoware.ai/install/amathutils_lib;/home/lukas/carla/autoware.ai/install/autoware_msgs;/home/lukas/carla/autoware.ai/install/autoware_map_msgs;/home/lukas/carla/autoware.ai/install/autoware_launcher_rviz;/home/lukas/carla/autoware.ai/install/autoware_launcher;/home/lukas/carla/autoware.ai/install/autoware_external_msgs;/home/lukas/carla/autoware.ai/install/autoware_driveworks_gmsl_interface;/home/lukas/carla/autoware.ai/install/autoware_config_msgs;/home/lukas/carla/autoware.ai/install/autoware_can_msgs;/home/lukas/carla/autoware.ai/install/autoware_build_flags;/home/lukas/carla/autoware.ai/install/autoware_bag_tools;/home/lukas/carla/autoware.ai/install/adi_driver;/opt/ros/kinetic").split(';'):
python_path = os.path.join(workspace, 'lib/python2.7/dist-packages')
if os.path.isdir(os.path.join(python_path, 'catkin')):
sys.path.insert(0, python_path)
break
from catkin.environment_cache import generate_environment_script
code = generate_environment_script('/home/lukas/carla/carla-autoware/catkin_ws/devel/env.sh')
output_filename = '/home/lukas/carla/carla-autoware/catkin_ws/build/catkin_generated/setup_cached.sh'
with open(output_filename, 'w') as f:
#print('Generate script for cached setup "%s"' % output_filename)
f.write('\n'.join(code))
mode = os.stat(output_filename).st_mode
os.chmod(output_filename, mode | stat.S_IXUSR)
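The last two lines above preserve the file's existing permission bits and add only the owner-execute bit. A minimal standalone sketch of that pattern, using a temporary file instead of the generated `setup_cached.sh` (POSIX permission semantics assumed):

```python
import os
import stat
import tempfile

def make_executable(path):
    # Same pattern as the generated script: read the current mode bits,
    # OR in the owner-execute bit, leave everything else intact.
    mode = os.stat(path).st_mode
    os.chmod(path, mode | stat.S_IXUSR)
    return bool(os.stat(path).st_mode & stat.S_IXUSR)

fd, demo_path = tempfile.mkstemp(suffix='.sh')
with os.fdopen(fd, 'w') as f:
    f.write('#!/bin/sh\necho cached setup\n')
is_executable = make_executable(demo_path)
os.remove(demo_path)
```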
# File: evalai/utils/urls.py (varunagrawal/evalai-cli, BSD-3-Clause)
from enum import Enum
class Urls(Enum):
challenge_list = "/api/challenges/challenge/all"
past_challenge_list = "/api/challenges/challenge/past"
future_challenge_list = "/api/challenges/challenge/future"
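The `Enum` above maps member names to API path fragments. A minimal usage sketch — the host value is hypothetical, not taken from this file:

```python
from enum import Enum

class Urls(Enum):
    challenge_list = "/api/challenges/challenge/all"
    past_challenge_list = "/api/challenges/challenge/past"

# .value yields the path fragment, typically joined onto a configured host.
host = "https://eval.ai"  # hypothetical base URL for illustration
url = host + Urls.challenge_list.value
```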
# File: Protheus_WebApp/Modules/SIGACTB/CTBA101TESTCASE.py (98llm/tir-script-samples, MIT)
from tir import Webapp
import unittest
class CTBA101(unittest.TestCase):
@classmethod
def setUpClass(inst):
inst.oHelper = Webapp()
inst.oHelper.Setup("SIGACTB", "01/01/2019", "T1", "D MG 01", "34")
inst.oHelper.Program("CTBA101")
def test_CTBA101_001(self):
        # Reversal (Estorno)
self.oHelper.WaitShow("Lançamento Contábil")
COD = "000000004 01/06/2015"
self.oHelper.SearchBrowse(
f"D MG 01 {COD}", "Filial+cta Debito + Data Lcto")
self.oHelper.SetButton("Outras Ações", "Estornar")
self.oHelper.SetButton("Ok")
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
def test_CTBA101_002(self):
        # Delete (Excluir)
COD = "000000005 02/06/2015"
self.oHelper.SearchBrowse(
f"D MG 01 {COD}", "Filial+cta Credito + Data Lcto")
self.oHelper.SetButton("Outras Ações", "Excluir")
self.oHelper.SetButton("Confirmar")
self.oHelper.SetButton("Sim")
self.oHelper.SearchBrowse(
f"D MG 01 {COD}", "Filial+cta Credito + Data Lcto")
self.oHelper.AssertTrue()
def test_CTBA101_003(self):
        # View (Visualizar)
self.oHelper.SetButton("Visualizar")
self.oHelper.SetButton("Confirmar")
self.oHelper.AssertTrue()
def test_CTBA101_004(self):
        # Edit (Alterar)
COD = "000000003 02/01/2020"
self.oHelper.SearchBrowse(
f"D MG 01 {COD}", "Filial+cta Credito + Data Lcto")
self.oHelper.SetButton("Alterar")
self.oHelper.SetButton("Ok")
self.oHelper.ClickFolder("Conversoes")
self.oHelper.CheckResult(
"crit", "Informada", grid=True, line=1)
self.oHelper.LoadGrid()
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
def test_CTBA101_006(self):
        # Insert a double-entry posting (Partida Dobrada)
self.oHelper.SetButton("Incluir")
self.oHelper.SetBranch("D MG 01")
self.oHelper.SetValue("Data","27062015")
self.oHelper.SetValue("Lote","CBA101")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cTipoCTB","3 - Partida Dobrada",name_attr=True)
self.oHelper.SetValue("cHistPad","005",name_attr=True)
self.oHelper.SetValue("cDebito","CTBA101D",name_attr=True)
self.oHelper.SetValue("cCredit","CTBA101C",name_attr=True)
self.oHelper.SetValue("nValorCTB","500,00",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.SetButton("Cancelar")
self.oHelper.AssertTrue()
def test_CTBA101_007(self):
        # Edit (Alterar)
self.oHelper.SearchBrowse(f"D MG 01 28/06/2015211EDC001000001001", "Filial+data Lcto + Numero Lote + Sub Lote + Numero Doc + Numero Linha + Tipo")
# self.oHelper.SearchBrowse(f"D MG 01 {codigo}",key=1,index=True)
self.oHelper.SetButton("Alterar")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cHistPad","005",name_attr=True)
self.oHelper.ClickFolder("Conversoes")
self.oHelper.SetValue("Crit", "5 - Nao tem Conversao", grid=True, grid_number=1, row=1)
self.oHelper.SetValue("Crit", "5 - Nao tem Conversao", grid=True, grid_number=1, row=2)
self.oHelper.SetValue("Crit", "5 - Nao tem Conversao", grid=True, grid_number=1, row=3)
self.oHelper.SetValue("Crit", "5 - Nao tem Conversao", grid=True, grid_number=1, row=4)
# self.oHelper.ClickFolder("Conversoes")
# self.oHelper.ClickFolder("Lançamento")
self.oHelper.LoadGrid()
        self.oHelper.ClickFolder("Outras Informaçäes")  # check the "Other Information" tab
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
def test_CTBA101_008(self):
        # Insert a debit posting (Debito)
self.oHelper.SetButton("Incluir")
self.oHelper.SetBranch("D MG 01")
self.oHelper.SetValue("Data","23062015")
self.oHelper.SetValue("Lote","CTINDE")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cTipoCTB","1 - Debito",name_attr=True)
self.oHelper.SetValue("cDebito","CTBA101D",name_attr=True)
self.oHelper.SetValue("cCustoCrd","000000003",name_attr=True)
self.oHelper.SetValue("cItemCrd","000000001",name_attr=True)
self.oHelper.SetValue("cCLVLCrd","000002",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTADEB", button="Fechar")
self.oHelper.SetValue("cCustoCrd","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTADEB", button="Fechar")
self.oHelper.SetValue("cItemCrd","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTADEB", button="Fechar")
self.oHelper.SetValue("cCLVLCrd","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOVALOR", button="Fechar")
self.oHelper.SetValue("nValorCTB","500,00",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOHIST", button="Fechar")
self.oHelper.SetValue("cHistPad","005",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="CT2DEB", button="Fechar")
self.oHelper.SetValue("cTpSald","9",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.SetButton("Cancelar")
self.oHelper.AssertTrue()
def test_CTBA101_009(self):
        # Insert a credit posting (Credito)
self.oHelper.SetButton("Incluir")
self.oHelper.SetBranch("D MG 01")
self.oHelper.SetValue("Data","24062015")
self.oHelper.SetValue("Lote","CTINCR")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cTpSald","9",name_attr=True)
self.oHelper.SetValue("cTipoCTB","2 - Credito",name_attr=True)
self.oHelper.SetValue("cCredit","CTBA101C",name_attr=True)
self.oHelper.SetValue("cCustoDeb","000000003",name_attr=True)
self.oHelper.SetValue("cItemDeb","000000001",name_attr=True)
self.oHelper.SetValue("cCLVLDeb","000002",name_attr=True)
self.oHelper.SetValue("nValorCTB","500,00",name_attr=True)
self.oHelper.SetValue("cHistPad","500",name_attr=True)
self.oHelper.CheckHelp(text_help="NOHISTPAD", button="Fechar")
self.oHelper.SetValue("cHistPad","005",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTACRD", button="Fechar")
self.oHelper.SetValue("cCustoDeb","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTACRD", button="Fechar")
self.oHelper.SetValue("cItemDeb","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.CheckHelp(text_help="NOCTACRD", button="Fechar")
self.oHelper.SetValue("cCLVLDeb","",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.SetButton("Cancelar")
self.oHelper.AssertTrue()
def test_CTBA101_010(self):
        # Insert a double-entry posting (Partida Dobrada)
self.oHelper.WaitShow("Lançamento Contábil")
self.oHelper.SetKey("F12")
self.oHelper.WaitShow("Parametros")
self.oHelper.SetValue("Repete lcto anter. ?","Nao")
self.oHelper.SetButton("Ok")
self.oHelper.SetButton("Incluir")
self.oHelper.SetBranch("D MG 01")
self.oHelper.SetValue("Data","25062015")
self.oHelper.SetValue("Lote","CTNRPT")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cTipoCTB","3 - Partida Dobrada",name_attr=True)
self.oHelper.SetValue("cHistPad","005",name_attr=True)
self.oHelper.SetValue("cDebito","CTBA101D",name_attr=True)
self.oHelper.SetValue("cCredit","CTBA101C",name_attr=True)
self.oHelper.SetValue("nValorCTB","1000,00",name_attr=True)
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.SetButton("Cancelar")
self.oHelper.AssertTrue()
def test_CTBA101_011(self):
        # Edit (Alterar)
self.oHelper.WaitShow("Lançamento Contábil")
self.oHelper.SearchBrowse(f"D MG 01 29/06/2015211EDH001000001001", "Filial+data Lcto + Numero Lote + Sub Lote + Numero Doc + Numero Linha + Tipo")
self.oHelper.SetButton("Alterar")
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("cHistPad","003",name_attr=True)
self.oHelper.ClickFolder("Conversoes")
self.oHelper.SetValue("Crit", "1 - Diaria", grid=True, grid_number=1, row=1)
self.oHelper.SetValue("Crit", "1 - Diaria", grid=True, grid_number=1, row=2)
self.oHelper.SetValue("Crit", "1 - Diaria", grid=True, grid_number=1, row=3)
self.oHelper.SetValue("Crit", "1 - Diaria", grid=True, grid_number=1, row=4)
# self.oHelper.ClickFolder("Conversoes")
# self.oHelper.ClickFolder("Lançamento")
self.oHelper.LoadGrid()
        self.oHelper.ClickFolder("Lançamento")  # back to the "Lançamento" tab
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
def test_CTBA101_012(self):
        # Copy (Copiar)
self.oHelper.WaitShow("Lançamento Contábil")
self.oHelper.SearchBrowse(f"D MG 01 30/06/2015COPIAR001000001001", "Filial+data Lcto + Numero Lote + Sub Lote + Numero Doc + Numero Linha + Tipo")
self.oHelper.SetButton("Outras Ações", "Copiar")
self.oHelper.SetValue("Lote","CPMSMA")
self.oHelper.SetButton("Ok")
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
def test_CTBA101_013(self):
        # Copy across branches (Cópia Filial)
self.oHelper.WaitShow("Lançamento Contábil")
self.oHelper.SearchBrowse(f"D MG 01 30/06/2015COPIAR001000001001", "Filial+data Lcto + Numero Lote + Sub Lote + Numero Doc + Numero Linha + Tipo")
self.oHelper.SetButton("Outras Ações", "Cópia Filial")
self.oHelper.SetValue("MV_PAR01","D MG 02",name_attr=True)
self.oHelper.SetButton("Ok")
self.oHelper.SetValue("Lote","CPOTFI")
self.oHelper.SetButton("Ok")
self.oHelper.SetButton("Salvar")
self.oHelper.SetButton("Sim")
self.oHelper.AssertTrue()
@classmethod
def tearDownClass(inst):
inst.oHelper.TearDown()
if __name__ == '__main__':
unittest.main()
# File: markets/config.py (xportation/unico-markets-api, MIT)
import os
def database_url():
return os.environ.get('DATABASE_URL', 'sqlite:///./storage.db')
def test_database_url():
return 'sqlite://'
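`database_url` falls back to a local SQLite file whenever the `DATABASE_URL` environment variable is unset. A self-contained sketch of that override behavior (the PostgreSQL URL below is a hypothetical example):

```python
import os

def database_url():
    # Same fallback pattern as markets/config.py: env var wins, SQLite otherwise.
    return os.environ.get('DATABASE_URL', 'sqlite:///./storage.db')

os.environ.pop('DATABASE_URL', None)                    # start from a clean env
default_url = database_url()
os.environ['DATABASE_URL'] = 'postgresql://example/db'  # hypothetical override
overridden_url = database_url()
os.environ.pop('DATABASE_URL', None)
```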
# File: simpleredial/dataloader/bert_mask_augmentation_dataloader.py (gmftbyGMFTBY/SimpleReDial-v1, MIT)
from header import *
from .utils import *
from .util_func import *
class BERTMaskAugmentationDataset(Dataset):
def __init__(self, vocab, path, **args):
self.args = args
self.vocab = vocab
self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
self.unk = self.vocab.convert_tokens_to_ids('[UNK]')
self.mask = self.vocab.convert_tokens_to_ids('[MASK]')
self.special_tokens = [self.pad, self.sep, self.cls]
suffix = args['tokenizer'].replace('/', '_')
self.pp_path = f'{os.path.splitext(path)[0]}_bert_mask_da_{suffix}.pt'
if os.path.exists(self.pp_path):
self.data = torch.load(self.pp_path)
print(f'[!] load preprocessed file from {self.pp_path}')
return None
data = read_text_data_utterances_full(path, lang=self.args['lang'], turn_length=self.args['full_turn_length'])
self.data = []
for label, utterances in tqdm(data):
item = self.vocab.encode(utterances[-1], add_special_tokens=False)
item = item[:self.args['res_max_len']-2]
num_valid = len([i for i in item if i not in self.special_tokens])
if num_valid < self.args['min_len']:
continue
            ids = [self.cls] + item + [self.sep]  # item is already truncated above
self.data.append({
'ids': ids,
'response': utterances[-1],
'context': utterances[:-1],
})
def __len__(self):
return len(self.data)
def __getitem__(self, i):
bundle = self.data[i]
return bundle['ids'], bundle['context'], bundle['response']
def save(self):
data = torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
ids = [i[0] for i in batch]
context = [i[1] for i in batch]
response = [i[2] for i in batch]
return {
'ids': ids,
'context': context,
'response': response,
'full': False,
}
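The truncation in `__init__` above keeps at most `res_max_len - 2` response tokens so the two special tokens still fit in the budget. A framework-free sketch, using 101/102 as stand-in ids rather than a real BERT vocabulary:

```python
# Stand-in special token ids (not from a real tokenizer).
CLS, SEP = 101, 102

def build_response_ids(token_ids, res_max_len):
    # Keep at most res_max_len - 2 content tokens, then wrap with [CLS]/[SEP].
    item = token_ids[:res_max_len - 2]
    return [CLS] + item + [SEP]

ids = build_response_ids(list(range(1, 11)), res_max_len=8)
```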
class BERTMaskAugmentationFullDataset(Dataset):
def __init__(self, vocab, path, **args):
self.args = args
self.vocab = vocab
self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
self.unk = self.vocab.convert_tokens_to_ids('[UNK]')
self.mask = self.vocab.convert_tokens_to_ids('[MASK]')
self.special_tokens = [self.pad, self.sep, self.cls]
suffix = args['tokenizer'].replace('/', '_')
self.pp_path = f'{os.path.splitext(path)[0]}_bert_mask_da_full_{suffix}.pt'
if os.path.exists(self.pp_path):
self.data = torch.load(self.pp_path)
print(f'[!] load preprocessed file from {self.pp_path}')
return None
data = read_text_data_utterances(path, lang=self.args['lang'])
self.data = []
for label, utterances in tqdm(data):
if label == 0:
continue
item = self.vocab.batch_encode_plus(utterances, add_special_tokens=False)['input_ids']
rids = []
            for i in item:
                i = i[:self.args['res_max_len']-2]
                ids = [self.cls] + i + [self.sep]
                rids.append(ids)
self.data.append({
'ids': rids,
'response': utterances[-1],
'context': utterances[:-1],
})
def __len__(self):
return len(self.data)
def __getitem__(self, i):
bundle = self.data[i]
return bundle['ids'], bundle['context'], bundle['response']
def save(self):
data = torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
ids, length = [], []
for i in batch:
ids.extend(i[0])
length.append(len(i[0]))
context = [i[1] for i in batch]
response = [i[2] for i in batch]
return {
'ids': ids,
'context': context,
'response': response,
'length': length,
'full': True,
}
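`collate` in this class flattens every example's `rids` into one batch while `length` records how many lists each example contributed, so the flat batch can be split back per example downstream. A toy sketch of that bookkeeping (the nested lists are made-up data):

```python
# Two examples' rids, as toy data: the first contributed 2 id lists, the second 1.
batch = [[[1, 2], [3]], [[4, 5, 6]]]

ids, length = [], []
for rids in batch:
    ids.extend(rids)          # flatten into one list of id lists
    length.append(len(rids))  # remember each example's contribution
```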
class BERTMaskAugmentationFullSepDataset(Dataset):
def __init__(self, vocab, path, **args):
self.args = args
self.vocab = vocab
self.pad = self.vocab.convert_tokens_to_ids('[PAD]')
self.sep = self.vocab.convert_tokens_to_ids('[SEP]')
self.cls = self.vocab.convert_tokens_to_ids('[CLS]')
self.unk = self.vocab.convert_tokens_to_ids('[UNK]')
self.mask = self.vocab.convert_tokens_to_ids('[MASK]')
self.special_tokens = [self.pad, self.sep, self.cls]
suffix = args['tokenizer'].replace('/', '_')
self.pp_path = f'{os.path.splitext(path)[0]}_bert_mask_da_full_all_{suffix}.pt'
if os.path.exists(self.pp_path):
self.data = torch.load(self.pp_path)
print(f'[!] load preprocessed file from {self.pp_path}')
return None
data = read_text_data_utterances_full(path, lang=self.args['lang'])
self.data = []
for label, utterances in tqdm(data):
if label == 0:
continue
item = self.vocab.batch_encode_plus(utterances, add_special_tokens=False)['input_ids']
ids = []
for u in item:
ids.extend(u + [self.sep])
ids.pop()
ids = [self.cls] + ids[-(self.args['max_len']-2):] + [self.sep]
sep_index = [idx for idx, i in enumerate(ids) if i == self.sep]
num_valid = len([i for i in ids if i not in self.special_tokens])
if num_valid < self.args['min_len']:
continue
self.data.append({
'ids': ids,
'sep_index': sep_index,
'response': utterances[-1],
'context': utterances[:-1],
})
def __len__(self):
return len(self.data)
def __getitem__(self, i):
bundle = self.data[i]
return bundle['ids'], bundle['sep_index'], bundle['context'], bundle['response']
def save(self):
data = torch.save(self.data, self.pp_path)
print(f'[!] save preprocessed dataset into {self.pp_path}')
def collate(self, batch):
ids, sep_index = [i[0] for i in batch], [i[1] for i in batch]
context = [i[2] for i in batch]
response = [i[3] for i in batch]
return {
'ids': ids,
'sep_index': sep_index,
'context': context,
'response': response,
'full': True,
}
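The `sep_index` list built in `__init__` above records every `[SEP]` position in the flattened multi-turn sequence so utterance boundaries can be recovered later. A minimal sketch with stand-in token ids (101/102 are not from a real tokenizer):

```python
# Stand-in special token ids.
CLS, SEP = 101, 102

# A flattened context: three utterances joined with [SEP].
ids = [CLS, 5, 6, SEP, 7, SEP, 8, 9, SEP]

# Same comprehension as in __init__: positions of every [SEP].
sep_index = [idx for idx, tok in enumerate(ids) if tok == SEP]
```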
# -*- coding: utf-8 -*-
# File: src/data_get.py (OriginalLinilin/course-selection-website, MIT)
# --------------------------------------------------------------
# Description: Flask blueprint exposing JSON data endpoints
# (teachers, students, admins, colleges/specialities, courses).
# --------------------------------------------------------------
from flask import Flask, session, Blueprint, request, flash
import mysql.connector
import json
from utils import *
from config import *
import traceback
data_bp = Blueprint('data', __name__, url_prefix='/data')
# Get teacher info
@data_bp.route('/TeacherInfo', methods=['GET', 'POST'])
def getData_TeacherInfo():
mydb = mysql.connector.connect(
host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
cur = mydb.cursor()
sql = "select TeacherNo,TeacherName,TeacherGender,TeacherBirthday,CollegeName from TeacherInfo,CollegeInfo where TeacherInfo.TeaCollegeNo=CollegeInfo.CollegeNo"
try:
cur.execute(sql)
cur_res = cur.fetchall()
if cur_res != []:
data = {}
data['total'] = len(cur_res)
data['rows'] = []
for res in cur_res:
cur_data = {"No": res[0], "name": res[1], "gender": res[2], "birthday": res[3].isoformat(),
"collegeName": res[4]}
data['rows'].append(cur_data)
data = json.dumps(data)
return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None
# Get student info
@data_bp.route('/StudentInfo', methods=['GET', 'POST'])
def getData_StudentInfo():
mydb = mysql.connector.connect(
host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
cur = mydb.cursor()
if (session["identity"] == "admin"):
sql = "select StudentNo,StudentName,StudentGender,StudentBirthday,CollegeName,SpecialityName from StudentInfo,CollegeInfo,SpecialityInfo where StudentInfo.CollegeNo=CollegeInfo.CollegeNo and StudentInfo.SpecialityNo=SpecialityInfo.SpecialityNo"
try:
cur.execute(sql)
cur_res = cur.fetchall()
if cur_res != []:
data = {}
data['total'] = len(cur_res)
data['rows'] = []
for res in cur_res:
cur_data = {"No": res[0], "name": res[1], "gender": res[2], "birthday": res[3].isoformat(),
"collegeName": res[4], "specialityName": res[5]}
data['rows'].append(cur_data)
data = json.dumps(data)
return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
else:
if(request.form["courseNo"] == ""):
return {}
else:
            sql = "select StudentNo,StudentName,StudentGender,StudentBirthday,CollegeName,SpecialityName from CollegeInfo,SpecialityInfo,StudentInfo where StudentInfo.StudentNo in (select StudentNo from StudentCurriculum SC where SC.CourseNo=%s) and StudentInfo.CollegeNo=CollegeInfo.CollegeNo and StudentInfo.SpecialityNo=SpecialityInfo.SpecialityNo"
            try:
                # Parameterized query avoids SQL injection via courseNo.
                cur.execute(sql, (request.form["courseNo"],))
cur_res = cur.fetchall()
if cur_res != []:
data = {}
data['total'] = len(cur_res)
data['rows'] = []
for res in cur_res:
cur_data = {"No": res[0], "name": res[1], "gender": res[2], "birthday": res[3].isoformat(),
"collegeName": res[4], "specialityName": res[5]}
data['rows'].append(cur_data)
data = json.dumps(data)
return data
            except Exception:
                traceback.print_exc()
                flash(traceback.format_exc())
                return None
# Get admin info
@data_bp.route('/AdminInfo', methods=['GET', 'POST'])
def getData_AdminInfo():
mydb = mysql.connector.connect(
host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
cur = mydb.cursor()
sql = "select AdminNo,AdminName,AdminGender,AdminBirthday from AdminInfo"
try:
cur.execute(sql)
cur_res = cur.fetchall()
if cur_res != []:
data = {}
data['total'] = len(cur_res)
data['rows'] = []
for res in cur_res:
cur_data = {"No": res[0], "name": res[1],
"gender": res[2], "birthday": res[3].isoformat()}
data['rows'].append(cur_data)
data = json.dumps(data)
return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None
# Get college/speciality (major) info
@data_bp.route('/CollegeSpeciality', methods=['GET', 'POST'])
def getData_CollegeSpeciality():
mydb = mysql.connector.connect(
host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
cur = mydb.cursor()
sql = "select CollegeName,SpecialityName,SpecialityNo from SpecialityInfo,CollegeInfo where SpecialityInfo.CollegeNo=CollegeInfo.CollegeNo;"
try:
cur.execute(sql)
cur_res = cur.fetchall()
if cur_res != []:
data = []
sameCol = {}
for res in cur_res:
if "ColName" in sameCol:
if sameCol["ColName"] == res[0]:
sameCol["SpeName"].append([res[1], res[2]])
else:
data.append(dict(sameCol))
sameCol["ColName"] = res[0]
sameCol["SpeName"] = [[res[1], res[2]]]
else:
sameCol["ColName"] = res[0]
sameCol["SpeName"] = [[res[1], res[2]]]
data.append(dict(sameCol))
data = json.dumps(data)
return data
else:
print("cur_res == []")
return None
except Exception as e:
print(traceback.print_exc())
flash(traceback.print_exc())
return None
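The `sameCol` loop above compacts `(college, speciality, no)` rows into one record per college, and only works if rows arrive grouped by college. The same grouping can be sketched with `itertools.groupby` (toy data, hypothetical names):

```python
from itertools import groupby
from operator import itemgetter

rows = [("CS", "AI", "01"), ("CS", "SE", "02"), ("Math", "Stat", "03")]

# groupby requires its input sorted by the grouping key, just as the
# hand-rolled loop requires the SQL result ordered by CollegeName.
rows.sort(key=itemgetter(0))
data = [{"ColName": college,
         "SpeName": [[spe, no] for _, spe, no in group]}
        for college, group in groupby(rows, key=itemgetter(0))]
```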


# Fetch course info
@data_bp.route('/CourseInfo', methods=['GET', 'POST'])
def getData_CourseInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select CourseNo,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.TeacherNo=TeacherInfo.TeacherNo and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
    try:
        cur.execute(sql)
        cur_res = cur.fetchall()
        if cur_res != []:
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherName": res[2], "classroom": res[6]}
                cur_data["time"] = cur_courseTime([res[3], res[4], res[5]])
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch courses taught by the current teacher
@data_bp.route('/TeachCourseInfo', methods=['GET', 'POST'])
def getData_TeachCourseInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select CourseNo,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.TeacherNo=%s and CourseInfo.TeacherNo=TeacherInfo.TeacherNo and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
    try:
        cur.execute(sql, (session["No"],))
        cur_res = cur.fetchall()
        if cur_res != []:
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherName": res[2], "classroom": res[6]}
                cur_data["time"] = cur_courseTime([res[3], res[4], res[5]])
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch speciality (major) curriculum
@data_bp.route('/SpecialityCurriculum', methods=['GET', 'POST'])
def getData_SpecialityCurriculum():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select SpecialityName,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from SpecialityCurriculum,CourseInfo,TeacherInfo,ClassroomInfo,SpecialityInfo where SpecialityCurriculum.SpecialityNo=SpecialityInfo.SpecialityNo and SpecialityCurriculum.CourseNo=CourseInfo.CourseNo and CourseInfo.TeacherNo=TeacherInfo.TeacherNo and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
    try:
        cur.execute(sql)
        cur_res = cur.fetchall()
        data = {}
        data['total'] = len(cur_res)
        data['rows'] = []
        for res in cur_res:
            cur_data = {"specialityName": res[0], "courseName": res[1],
                        "teacherName": res[2], "classroom": res[6]}
            cur_data["time"] = cur_courseTime([res[3], res[4], res[5]])
            data['rows'].append(cur_data)
        data = json.dumps(data)
        return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch speciality (major) curriculum as a timetable grid
@data_bp.route('/SpecialityCurriculum_display', methods=['GET', 'POST'])
def getData_SpecialityCurriculum_display():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select CourseNo,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.CourseNo in (select CourseNo from SpecialityCurriculum where SpecialityCurriculum.SpecialityNo = %s) and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo and CourseInfo.TeacherNo=TeacherInfo.TeacherNo;"
    try:
        cur.execute(sql, (session["speciality"],))
        cur_res = cur.fetchall()
        data = {}
        data['total'] = len(cur_res)
        # Ten class periods per day; each dict is one timetable row.
        data['rows'] = [{'ClassNo': str(i)} for i in range(1, 11)]
        week = ["Monday", "Tuesday", "Wednesday",
                "Thursday", "Friday", "Saturday", "Sunday"]
        for res in cur_res:
            cur_data = res[1] + ' by ' + res[2] + '\n(' + res[6] + ')'
            courseBeginNo = int(res[4])
            courseDay = week[int(res[3]) - 1]
            for i in range(res[5]):
                data['rows'][courseBeginNo + i - 1][courseDay] = cur_data
        data = json.dumps(data)
        return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None
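The `_display` routes all share the same trick: ten dicts keyed by `ClassNo` act as timetable rows, and each course stamps its cell for `CourseNums` consecutive periods under a weekday column. The grid fill in isolation (toy course tuple, hypothetical values):

```python
week = ["Monday", "Tuesday", "Wednesday",
        "Thursday", "Friday", "Saturday", "Sunday"]
rows = [{"ClassNo": str(i)} for i in range(1, 11)]

# (name, teacher, day 1-7, first period 1-10, number of periods, room)
course = ("Algebra", "Smith", 2, 3, 2, "A-101")
name, teacher, day, begin, nums, room = course

cell = name + " by " + teacher + "\n(" + room + ")"
for i in range(nums):
    # period numbers are 1-based while list indices are 0-based
    rows[begin + i - 1][week[day - 1]] = cell
```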


# Fetch the student's personal curriculum
@data_bp.route('/StudentCurriculum', methods=['GET', 'POST'])
def getData_StudentCurriculum():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select CourseNo,CourseName,TeacherInfo.TeacherNo,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.CourseNo in (select CourseNo from StudentCurriculum where StudentCurriculum.StudentNo=%s) and CourseInfo.TeacherNo=TeacherInfo.TeacherNo and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
    try:
        cur.execute(sql, (session["No"],))
        cur_res = cur.fetchall()
        if cur_res != []:
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherID": res[2], "teacherName": res[3],
                            "classroom": res[7]}
                cur_data["time"] = cur_courseTime([res[4], res[5], res[6]])
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch the student's personal curriculum as a timetable grid
@data_bp.route('/StudentCurriculum_display', methods=['GET', 'POST'])
def getData_StudentCurriculum_display():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select CourseNo,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.CourseNo in (select CourseNo from StudentCurriculum where StudentCurriculum.StudentNo = %s) and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo and CourseInfo.TeacherNo=TeacherInfo.TeacherNo;"
    try:
        cur.execute(sql, (session["No"],))
        cur_res = cur.fetchall()
        data = {}
        data['total'] = len(cur_res)
        data['rows'] = [{'ClassNo': str(i)} for i in range(1, 11)]
        week = ["Monday", "Tuesday", "Wednesday",
                "Thursday", "Friday", "Saturday", "Sunday"]
        for res in cur_res:
            cur_data = res[1] + ' by ' + res[2] + '\n(' + res[6] + ')'
            courseBeginNo = int(res[4])
            courseDay = week[int(res[3]) - 1]
            for i in range(res[5]):
                data['rows'][courseBeginNo + i - 1][courseDay] = cur_data
        data = json.dumps(data)
        return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch the teacher's timetable grid
@data_bp.route('/TeacherCurriculum_display', methods=['GET', 'POST'])
def getData_TeacherCurriculum_display():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    # The TeacherInfo join condition was missing here, which produced a
    # cartesian product with TeacherInfo; it is restored below.
    sql = "select CourseNo,CourseName,TeacherName,CourseDay,CourseBeginNo,CourseNums,ClassroomPosition from CourseInfo,TeacherInfo,ClassroomInfo where CourseInfo.TeacherNo=%s and CourseInfo.TeacherNo=TeacherInfo.TeacherNo and CourseInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
    try:
        cur.execute(sql, (session["No"],))
        cur_res = cur.fetchall()
        data = {}
        data['total'] = len(cur_res)
        data['rows'] = [{'ClassNo': str(i)} for i in range(1, 11)]
        week = ["Monday", "Tuesday", "Wednesday",
                "Thursday", "Friday", "Saturday", "Sunday"]
        for res in cur_res:
            cur_data = res[1] + ' by ' + res[2] + '\n(' + res[6] + ')'
            courseBeginNo = int(res[4])
            courseDay = week[int(res[3]) - 1]
            for i in range(res[5]):
                data['rows'][courseBeginNo + i - 1][courseDay] = cur_data
        data = json.dumps(data)
        return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch teaching applications
@data_bp.route('/TeachApplication', methods=['GET', 'POST'])
def getData_TeachApplication():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    if session["identity"] == "admin":
        sql = "select TeachCourseApplication.TeacherNo,TeacherName,TeachCourseApplication.CourseNo,CourseName,Reason from CourseInfo,TeacherInfo,TeachCourseApplication where TeachCourseApplication.TeacherNo=TeacherInfo.TeacherNo and TeachCourseApplication.CourseNo=CourseInfo.CourseNo and TeachCourseApplication.Status='waiting';"
        try:
            cur.execute(sql)
            cur_res = cur.fetchall()
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"teacherID": res[0], "teacherName": res[1],
                            "courseID": res[2], "courseName": res[3],
                            "reason": res[4]}
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    else:
        sql = "select TeachCourseApplication.CourseNo,CourseName,Reason,Status from CourseInfo,TeachCourseApplication where TeachCourseApplication.TeacherNo = %s and TeachCourseApplication.CourseNo=CourseInfo.CourseNo;"
        try:
            cur.execute(sql, (session["No"],))
            cur_res = cur.fetchall()
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"courseID": res[0],
                            "courseName": res[1], "reason": res[2]}
                if res[3] == "waiting":
                    cur_data["status"] = "等待审核"  # awaiting review
                elif res[3] == "dismissed":
                    cur_data["status"] = "驳回"  # rejected
                elif res[3] == "passed":
                    cur_data["status"] = "通过"  # approved
                else:
                    cur_data["status"] = "未知"  # unknown
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
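The waiting/dismissed/passed chain above maps status codes to Chinese display labels, and it is repeated verbatim in the course-start route below. A dict lookup with a default collapses it to one line (labels as in the source; `status_label` is an illustrative helper name, not part of the app):

```python
STATUS_LABELS = {"waiting": "等待审核",    # awaiting review
                 "dismissed": "驳回",      # rejected
                 "passed": "通过"}         # approved


def status_label(code):
    """Map a raw application status code to its display label."""
    return STATUS_LABELS.get(code, "未知")  # unknown
```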


# Fetch course-start applications
@data_bp.route('/StartApplication', methods=['GET', 'POST'])
def getData_StartApplication():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    if session["identity"] == "admin":
        sql = "select StartCourseApplication.TeacherNo,TeacherName,CourseName,Reason from TeacherInfo,StartCourseApplication where StartCourseApplication.TeacherNo=TeacherInfo.TeacherNo and StartCourseApplication.Status='waiting';"
        try:
            cur.execute(sql)
            cur_res = cur.fetchall()
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"teacherID": res[0], "teacherName": res[1],
                            "courseName": res[2], "reason": res[3]}
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    else:
        sql = "select CourseName,Reason,Status from StartCourseApplication where StartCourseApplication.TeacherNo = %s;"
        try:
            cur.execute(sql, (session["No"],))
            cur_res = cur.fetchall()
            data = {}
            data['total'] = len(cur_res)
            data['rows'] = []
            for res in cur_res:
                cur_data = {"courseName": res[0], "reason": res[1]}
                if res[2] == "waiting":
                    cur_data["status"] = "等待审核"  # awaiting review
                elif res[2] == "dismissed":
                    cur_data["status"] = "驳回"  # rejected
                elif res[2] == "passed":
                    cur_data["status"] = "通过"  # approved
                else:
                    cur_data["status"] = "未知"  # unknown
                data['rows'].append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None


# Fetch classroom info
@data_bp.route('/ClassroomInfo', methods=['GET', 'POST'])
def getData_ClassroomInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    sql = "select ClassroomNo,ClassroomPosition from ClassroomInfo;"
    try:
        cur.execute(sql)
        cur_res = cur.fetchall()
        if cur_res != []:
            data = []
            for res in cur_res:
                cur_data = {"No": res[0], "position": res[1]}
                data.append(cur_data)
            data = json.dumps(data)
            return data
    except Exception:
        traceback.print_exc()
        flash(traceback.format_exc())
        return None


# Fetch exam info
@data_bp.route('/ExamInfo', methods=['GET', 'POST'])
def getData_ExamInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    if session["identity"] == "admin":
        sql = "select ExamInfo.CourseNo,CourseName,ExamDay,ExamBeginTime,ExamEndTime,TeacherName,ClassroomPosition from ExamInfo,CourseInfo,TeacherInfo,ClassroomInfo where ExamInfo.CourseNo=CourseInfo.CourseNo and ExamInfo.TeacherNo=TeacherInfo.TeacherNo and ExamInfo.ClassroomNo=ClassroomInfo.ClassroomNo;"
        try:
            cur.execute(sql)
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherName": res[5], "classroomPosition": res[6]}
                cur_data["examTime"] = cur_examTime([res[2], res[3], res[4]])
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    elif session["identity"] == "student":
        sql = "select ExamInfo.CourseNo,CourseName,ExamDay,ExamBeginTime,ExamEndTime,TeacherName,ClassroomPosition from ExamInfo,CourseInfo,TeacherInfo,ClassroomInfo where ExamInfo.CourseNo in (select CourseNo from StudentCurriculum where StudentNo=%s) and ExamInfo.TeacherNo=TeacherInfo.TeacherNo and ExamInfo.ClassroomNo=ClassroomInfo.ClassroomNo and ExamInfo.CourseNo=CourseInfo.CourseNo;"
        try:
            cur.execute(sql, (session["No"],))
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherName": res[5], "classroomPosition": res[6]}
                cur_data["examTime"] = cur_examTime([res[2], res[3], res[4]])
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    elif session["identity"] == "teacher":
        if request.form["choice"] == "monitor":
            # Exams this teacher proctors (ExamInfo.TeacherNo).
            sql = "select ExamInfo.CourseNo,CourseName,ExamDay,ExamBeginTime,ExamEndTime,ClassroomPosition from ExamInfo,CourseInfo,ClassroomInfo where ExamInfo.TeacherNo=%s and ExamInfo.ClassroomNo=ClassroomInfo.ClassroomNo and ExamInfo.CourseNo=CourseInfo.CourseNo;"
            try:
                cur.execute(sql, (session["No"],))
                cur_res = cur.fetchall()
                data = []
                for res in cur_res:
                    cur_data = {"courseID": res[0], "courseName": res[1],
                                "teacherName": session['name'],
                                "classroomPosition": res[5]}
                    cur_data["examTime"] = cur_examTime(
                        [res[2], res[3], res[4]])
                    data.append(cur_data)
                data = json.dumps(data)
                return data
            except Exception:
                traceback.print_exc()
                flash(traceback.format_exc())
                return None
        elif request.form["choice"] == "class":
            # Exams of the courses this teacher teaches (CourseInfo.TeacherNo).
            sql = "select ExamInfo.CourseNo,CourseName,ExamDay,ExamBeginTime,ExamEndTime,ClassroomPosition from ExamInfo,CourseInfo,ClassroomInfo where CourseInfo.TeacherNo=%s and ExamInfo.ClassroomNo=ClassroomInfo.ClassroomNo and ExamInfo.CourseNo=CourseInfo.CourseNo;"
            try:
                cur.execute(sql, (session["No"],))
                cur_res = cur.fetchall()
                data = []
                for res in cur_res:
                    cur_data = {"courseID": res[0], "courseName": res[1],
                                "teacherName": session['name'],
                                "classroomPosition": res[5]}
                    cur_data["examTime"] = cur_examTime(
                        [res[2], res[3], res[4]])
                    data.append(cur_data)
                data = json.dumps(data)
                return data
            except Exception:
                traceback.print_exc()
                flash(traceback.format_exc())
                return None
        else:
            return {'status': False, 'reason': 'wrong choice'}


# Fetch score info
@data_bp.route('/ScoreInfo', methods=['GET', 'POST'])
def getData_ScoreInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    if session["identity"] == "student":
        sql = "select ScoreInfo.CourseNo,CourseName,Score from CourseInfo,ScoreInfo where ScoreInfo.StudentNo=%s and ScoreInfo.CourseNo=CourseInfo.CourseNo;"
        try:
            cur.execute(sql, (session["No"],))
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"courseID": res[0],
                            "courseName": res[1], "score": res[2]}
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    elif session["identity"] == "teacher":
        if request.form["courseNo"] == "":
            return {}
        # Right join keeps enrolled students even when no score exists yet.
        sql = "select SC.StudentNo,StudentName,Score from ScoreInfo right join (select StudentInfo.StudentNo,StudentName from StudentCurriculum,StudentInfo where StudentCurriculum.CourseNo=%s and StudentCurriculum.StudentNo=StudentInfo.StudentNo) SC on ScoreInfo.CourseNo=%s and ScoreInfo.StudentNo=SC.StudentNo"
        try:
            cur.execute(sql, (request.form["courseNo"],
                              request.form["courseNo"]))
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"No": res[0], "name": res[1], "score": res[2]}
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None


# Fetch evaluation info
@data_bp.route('/EvaluationInfo', methods=['GET', 'POST'])
def getData_EvaluationInfo():
    mydb = mysql.connector.connect(
        host=mysql_host, user=mysql_user, passwd=mysql_passwd, db=mysql_db)
    cur = mydb.cursor()
    if session["identity"] == "student":
        sql = "select E.CourseNo,CourseName,E.TeacherNo,TeacherName,StuContent from (select CourseNo,TeacherNo,StuContent from EvaluationInfo where StudentNo=%s) E,TeacherInfo,CourseInfo where E.CourseNo=CourseInfo.CourseNo and E.TeacherNo=TeacherInfo.TeacherNo;"
        try:
            cur.execute(sql, (session["No"],))
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"courseID": res[0], "courseName": res[1],
                            "teacherID": res[2], "teacherName": res[3],
                            "content": res[4]}
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
    elif session["identity"] == "teacher":
        if request.form["courseNo"] == "":
            return {}
        sql = "select E.CourseNo,CourseName,StuContent from (select CourseNo,StuContent from EvaluationInfo where CourseNo=%s and TeaContent is null) E,CourseInfo where E.CourseNo=CourseInfo.CourseNo;"
        try:
            cur.execute(sql, (request.form["courseNo"],))
            cur_res = cur.fetchall()
            data = []
            for res in cur_res:
                cur_data = {"courseID": res[0],
                            "courseName": res[1], "content": res[2]}
                data.append(cur_data)
            data = json.dumps(data)
            return data
        except Exception:
            traceback.print_exc()
            flash(traceback.format_exc())
            return None
# pysem/imputer.py — from planplus/pysem (MIT license)
# -*- coding: utf-8 -*-
"""Internal imputer module."""
from .model import Model
from .model_means import ModelMeans
from .model_effects import ModelEffects
from .solver import Solver
from .utils import chol_inv
from . import startingvalues
import pandas as pd
import numpy as np


class Imputer(Model):
    """Model for missing data imputation."""

    symb_missing = '@'
    matrices_names = tuple(list(Model.matrices_names) + ['data_imp'])

    def __init__(self, model: Model, data: pd.DataFrame, factors=True):
        """
        Instantiate Imputer.

        Parameters
        ----------
        model : Model
            Model.
        data : pd.DataFrame
            Data with missing data labeled as np.nan.
        factors : bool
            If True, factors are estimated. The default is True.

        Returns
        -------
        None.

        """
        self.mod = model
        self.mx_data_imp = data.copy()
        self.n_param_missing = 0
        self.factors = factors
        desc = model.description
        self.dict_effects[self.symb_missing] = self.effect_missing
        super().__init__(desc, mimic_lavaan=model.mimic_lavaan)
        self.objectives = {'ML': (self.obj_ml, self.grad_ml)}
        self.load_starting_values()

    def finalize_variable_classification(self):
        """
        Finalize variable classification.

        Reorders variables for better visual fanciness and does extra
        model-specific variable respecification.

        Returns
        -------
        None.

        """
        super().finalize_variable_classification()
        if self.factors:
            self.vars['observed'] += sorted(self.vars['latent'])

    def setup_matrices(self):
        """
        Initialize base matrix structures of the model.

        Returns
        -------
        None.

        """
        super().setup_matrices()
        for i, f in enumerate(self.start_rules):
            name = f.__name__
            if not name.endswith('_imp'):
                self.start_rules[i] = getattr(startingvalues, name + '_imp')

    def build_data_imp(self):
        """
        Build model-implied imputed data matrix.

        Returns
        -------
        mx : np.ndarray
            Model-implied data matrix.
        names : tuple
            Row and column names.

        """
        mx = self.mx_data_imp[self.vars['observed']].copy()
        names = (list(mx.index), list(mx.columns))
        return mx.values, names

    def preprocess_effects(self, effects: dict):
        """
        Run a routine just before effects are applied.

        Used to apply covariances to model.

        Parameters
        ----------
        effects : dict
            Mapping opcode->lvalues->rvalues->multiplicator.

        Returns
        -------
        None.

        """
        super().preprocess_effects(effects)
        for _, lvals in effects.items():
            for lval, rvals in lvals.items():
                for rval in rvals:
                    rvals[rval] = self.symb_starting_values
        missing = effects[self.symb_missing]
        obs = self.vars['observed']
        mx = self.mx_data_imp
        for v in obs:
            if v not in mx.columns:
                mx[v] = np.nan
        mx = mx[obs]
        inds = list(mx.index)
        # Every NaN cell becomes a free "missing" parameter to be imputed.
        for i, j in zip(*np.where(np.isnan(mx))):
            v = obs[j]
            missing[inds[i]][v] = None

    def effect_missing(self, items: dict):
        """
        Work through missing operation.

        Parameters
        ----------
        items : dict
            Mapping lvalues->rvalues->multiplicator.

        Returns
        -------
        None.

        """
        mx, (rows, cols) = self.mx_data_imp, self.names_data_imp
        for lv, rvs in items.items():
            i = rows.index(lv)
            for rv in rvs:
                j = cols.index(rv)
                ind = (i, j)
                self.n_param_missing += 1
                name = f'_m{self.n_param_missing}'
                self.add_param(name, matrix=mx, indices=ind, start=None,
                               active=True, symmetric=False,
                               bound=(None, None))

    def fit(self, solver='SLSQP', clean_slate=False):
        """
        Perform data imputation.

        Parameters
        ----------
        solver : str
            Solver to use. The default is 'SLSQP'.

        Returns
        -------
        SolverResult
            Result of optimization.

        """
        if clean_slate or not hasattr(self, 'param_vals'):
            self.prepare_params()
        obj, grad = self.get_objective('ML')
        solver = Solver(solver, obj, grad, self.param_vals)
        res = solver.solve()
        self.update_matrices(res.x)
        return res

    def prepare_params(self):
        """
        Prepare structures for effective optimization routines.

        Returns
        -------
        None.

        """
        super().prepare_params()
        self.mx_sigma = self.calc_sigma()[0]
        try:
            self.mx_sigma_inv = chol_inv(self.mx_sigma)
        except np.linalg.LinAlgError:
            self.mx_sigma_inv = np.linalg.pinv(self.mx_sigma)
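`prepare_params` inverts the model-implied covariance via a Cholesky-based routine and falls back to the pseudo-inverse when the matrix is not positive definite. The pattern in plain NumPy (`chol_inv` here is pysem's own helper; `safe_inv` below is an illustrative stand-in):

```python
import numpy as np


def safe_inv(sigma):
    """Invert via Cholesky when possible, else fall back to pinv."""
    try:
        # np.linalg.cholesky raises LinAlgError for non-positive-definite input
        l = np.linalg.cholesky(sigma)
        l_inv = np.linalg.inv(l)
        return l_inv.T @ l_inv  # (L L^T)^-1 = L^-T L^-1
    except np.linalg.LinAlgError:
        return np.linalg.pinv(sigma)


spd = np.array([[2.0, 1.0], [1.0, 2.0]])       # positive definite
singular = np.array([[1.0, 1.0], [1.0, 1.0]])  # rank deficient
inv_spd = safe_inv(spd)
inv_sing = safe_inv(singular)
```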
    def calc_data_grad(self):
        """
        Calculate model-implied data gradient.

        Returns
        -------
        list
            List of gradient values.

        """
        return self.mx_diffs

    def obj_ml(self, x: np.ndarray):
        """
        Calculate ML objective value.

        Parameters
        ----------
        x : np.ndarray
            Parameters vector.

        Returns
        -------
        float
            Loglikelihood value.

        """
        self.update_matrices(x)
        data = self.mx_data_imp
        return np.einsum('ij,jk,ik->', data, self.mx_sigma_inv, data)
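`obj_ml` evaluates the quadratic form tr(X Σ⁻¹ Xᵀ) in a single `einsum` contraction, never materialising the n×n product. That `'ij,jk,ik->'` subscript is equivalent to the naive trace, as a quick check with random data shows:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((5, 3))  # n observations x p variables
sigma_inv = np.eye(3) * 2.0         # stand-in for the inverted covariance

# single contraction: sum_{i,j,k} data[i,j] * sigma_inv[j,k] * data[i,k]
fast = np.einsum('ij,jk,ik->', data, sigma_inv, data)
# naive equivalent: trace of the full n x n product
naive = np.trace(data @ sigma_inv @ data.T)
```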
    def grad_ml(self, x: np.ndarray):
        """
        Gradient of ML objective function.

        Parameters
        ----------
        x : np.ndarray
            Parameters vector.

        Returns
        -------
        np.ndarray
            Gradient of ML.

        """
        self.update_matrices(x)
        data_grad = self.calc_data_grad()
        t = self.mx_sigma_inv @ self.mx_data_imp.T
        return 2 * np.array([np.einsum('ij,ji->', g[4], t)
                             for g in data_grad])

    def get_fancy(self):
        """
        Return imputed data in DataFrame form.

        Returns
        -------
        pd.DataFrame
            DataFrame with imputed data and factor scores.

        """
        data = pd.DataFrame(self.mx_data_imp, index=self.names_data_imp[0],
                            columns=self.names_data_imp[1])
        return data

    def operation_start(self, operation):
        pass

    def operation_bound(self, operation):
        pass

    def operation_constraint(self, operation):
        pass


class ImputerMeans(ModelMeans):
    """ModelMeans for missing data imputation."""

    symb_missing = '@'
    matrices_names = tuple(list(ModelMeans.matrices_names) +
                           ['data_imp', 'g_imp'])

    def __init__(self, model: ModelMeans, data: pd.DataFrame, factors=True):
        """
        Instantiate ImputerMeans.

        Parameters
        ----------
        model : ModelMeans
            Model with meanstructure.
        data : pd.DataFrame
            Data with missing data labeled as np.nan.
        factors : bool
            If True, factors are estimated. The default is True.

        Returns
        -------
        None.

        """
        self.mod = model
        self.mx_data_imp = data.copy()
        if model.intercepts:
            data = data.copy()
            data['1'] = 1.0
        t = [v for v in model.vars['observed_exogenous']
             if v in data.columns]
        self.mx_g = data[t].copy()
        self.n_param_missing = 0
        self.factors = factors
        desc = model.description
        self.dict_effects[self.symb_missing] = self.effect_missing
        super().__init__(desc, mimic_lavaan=model.mimic_lavaan,
                         intercepts=model.intercepts)
        self.objectives = {'ML': (self.obj_ml, self.grad_ml)}
        self.load_starting_values()

    def finalize_variable_classification(self):
        """
        Finalize variable classification.

        Reorders variables for better visual fanciness and does extra
        model-specific variable respecification.

        Returns
        -------
        None.

        """
        super().finalize_variable_classification()
        if self.factors:
            self.vars['observed'] += sorted(self.vars['latent'])

    def setup_matrices(self):
        """
        Initialize base matrix structures of the model.

        Returns
        -------
        None.

        """
        super().setup_matrices()
        for i, f in enumerate(self.start_rules):
            name = f.__name__
            if not name.endswith('_imp'):
                self.start_rules[i] = getattr(startingvalues, name + '_imp')

    def build_data_imp(self):
        """
        Build model-implied imputed data matrix.

        Returns
        -------
        mx : np.ndarray
            Model-implied data matrix.
        names : tuple
            Row and column names.

        """
        mx = self.mx_data_imp[self.vars['observed']].copy()
        names = (list(mx.columns), list(mx.index))
        return mx.values.T, names

    def build_g_imp(self):
        """
        Build model-implied imputed data G matrix.

        Returns
        -------
        mx : np.ndarray
            Model-implied G matrix.
        names : tuple
            Row and column names.

        """
        mx = self.mx_g[self.vars['observed_exogenous']].copy()
        names = (list(mx.columns), list(mx.index))
        return mx.values.T, names

    def preprocess_effects(self, effects: dict):
        """
        Run a routine just before effects are applied.

        Used to apply covariances to model.

        Parameters
        ----------
        effects : dict
            Mapping opcode->lvalues->rvalues->multiplicator.

        Returns
        -------
        None.

        """
        super().preprocess_effects(effects)
        for _, lvals in effects.items():
            for lval, rvals in lvals.items():
                for rval in rvals:
                    rvals[rval] = self.symb_starting_values
        missing = effects[self.symb_missing]
        obs = self.vars['observed']
        mx = self.mx_data_imp
        for v in obs:
            if v not in mx.columns:
                mx[v] = np.nan
        mx = mx[obs]
        inds = list(mx.index)
        for i, j in zip(*np.where(np.isnan(mx))):
            v = obs[j]
            missing[inds[i]][v] = None
        obs = self.vars['observed_exogenous']
        mx = self.mx_g
        for v in obs:
            if v not in mx.columns:
                mx[v] = np.nan
        mx = mx[obs]
        inds = list(mx.index)
        for i, j in zip(*np.where(np.isnan(mx))):
            v = obs[j]
            missing[inds[i]][v] = None

    def effect_missing(self, items: dict):
        """
        Work through missing operation.

        Parameters
        ----------
        items : dict
            Mapping lvalues->rvalues->multiplicator.

        Returns
        -------
        None.

        """
        obs_exo1 = self.vars['observed_exogenous_1']
        obs_exo2 = self.vars['observed_exogenous_2']
        for lv, rvs in items.items():
            for rv in rvs:
                # rv is the variable name; route it to the matrix it lives in.
                if rv in obs_exo1:
                    mx = self.mx_g1
                    rows, cols = self.names_g1_imp
                elif rv in obs_exo2:
                    mx = self.mx_g2
                    rows, cols = self.names_g2_imp
                else:
                    mx, (rows, cols) = self.mx_data_imp, self.names_data_imp
                i = rows.index(rv)
                j = cols.index(lv)
                ind = (i, j)
                self.n_param_missing += 1
                name = f'_m{self.n_param_missing}'
                self.add_param(name, matrix=mx, indices=ind, start=None,
                               active=True, symmetric=False,
                               bound=(None, None))

    def fit(self, solver='SLSQP', clean_slate=False):
        """
        Perform data imputation.

        Parameters
        ----------
        solver : str
            Solver to use. The default is 'SLSQP'.

        Returns
        -------
        SolverResult
            Result of optimization.

        """
        if clean_slate or not hasattr(self, 'param_vals'):
            self.prepare_params()
        obj, grad = self.get_objective('ML')
        solver = Solver(solver, obj, grad, self.param_vals)
        res = solver.solve()
        self.update_matrices(res.x)
        return res

    def prepare_params(self):
        """
        Prepare structures for effective optimization routines.

        Returns
        -------
        None.

        """
        super().prepare_params()
        self.mx_sigma = self.calc_sigma()[0]
        try:
            self.mx_sigma_inv = chol_inv(self.mx_sigma)
        except np.linalg.LinAlgError:
            self.mx_sigma_inv = np.linalg.pinv(self.mx_sigma)
        i = np.identity(self.mx_beta.shape[0])
        self.mx_m = self.mx_lambda @ np.linalg.inv(i - self.mx_beta)
        self.mx_g = self.mx_g_imp
        self.mx_mg = self.mx_m @ self.mx_gamma1 + self.mx_gamma2

    def calc_data_grad(self):
        """
        Calculate model-implied data gradient.

        Returns
        -------
        list
            List of gradient values.

        """
        grad = list()
        for mxs in self.mx_diffs:
            g = np.float32(0.0)
            if mxs[-2] is not None:  # data_imp
                g += mxs[-2]
            if mxs[-1] is not None:  # g1
                g -= self.mx_mg @ mxs[-1]
            grad.append(g)
        return grad

    def obj_ml(self, x: np.ndarray):
        """
        Calculate ML objective value.

        Parameters
        ----------
        x : np.ndarray
            Parameters vector.

        Returns
        -------
        float
            Loglikelihood value.

        """
        self.update_matrices(x)
        center = self.mx_data_imp - self.calc_mean(self.mx_m)
        return np.einsum('ji,jk,ki->', center, self.mx_sigma_inv, center)

    def grad_ml(self, x: np.ndarray):
        """
        Gradient of ML objective function.

        Parameters
        ----------
        x : np.ndarray
            Parameters vector.

        Returns
        -------
        np.ndarray
            Gradient of ML.

        """
        self.update_matrices(x)
        data_grad = self.calc_data_grad()
        center = self.mx_data_imp - self.calc_mean(self.mx_m)
        t = self.mx_sigma_inv @ center
        return 2 * np.array([np.einsum('ji,ji->', g, t)
                             for g in data_grad])

    def get_fancy(self):
        """
        Return imputed data in DataFrame form.

        Returns
        -------
        pd.DataFrame
            DataFrame with imputed data and factor scores.

        """
        data = pd.DataFrame(self.mx_data_imp, index=self.names_data_imp[0],
                            columns=self.names_data_imp[1])
        g = pd.DataFrame(self.mx_g_imp, index=self.names_g_imp[0],
                         columns=self.names_g_imp[1])
        if '1' in g.index:
            g.drop('1', inplace=True)
        res = pd.concat([data, g]).T
        var = self.mod.vars
        t = sorted(var['all'] - var['latent']) + sorted(var['latent'])
        return res[t]

    def operation_start(self, operation):
        pass

    def operation_bound(self, operation):
        pass

    def operation_constraint(self, operation):
        pass
class ImputerEffects(ModelEffects):
"""ModelEffects for missing data imputation."""
symb_missing = '@'
matrices_names = tuple(list(ModelEffects.matrices_names) + \
['data_imp', 'g_imp'])
def __init__(self, model: ModelMeans, data: pd.DataFrame, group: str,
k=None, factors=True):
"""
Instantiate ImputerMeans.
Parameters
----------
model : ModelMeans
Model with meanstructure.
data : pd.DataFrame
Data with missing data labeled as np.nan.
k : pd.DataFrame
Kinship between individuals matrix. If None, identity is assumed.
The default is None.
factors: bool
If True, factors are estimated. The default is True.
Returns
-------
None.
"""
self.mod = model
if model.intercepts:
data = data.copy()
data['1'] = 1.0
self.mx_data_orig = data.copy()
self.mx_data_imp = data
t = [v for v in model.vars['observed_exogenous']
if v in data.columns]
self.mx_g = data[t].copy()
self.n_param_missing = 0
self.factors = factors
desc = model.description
self.dict_effects[self.symb_missing] = self.effect_missing
super().__init__(desc, mimic_lavaan=model.mimic_lavaan,
intercepts=model.intercepts)
self.objectives = {'ML': (self.obj_ml, None)}
self.load_starting_values()
self.load(group, k=k)
def load(self, group: str, k=None):
"""
Load kinship K matrix.
Parameters
----------
group : str
Name of column with group labels.
k : pd.DataFrame
Covariance matrix across rows, i.e. kinship matrix. If None,
identity is assumed. The default is None.
Raises
------
KeyError
Raised when there are variables missing from the data.
Exception
Raised when the group parameter is None.
Returns
-------
None.
"""
obs = self.vars['observed']
data = self.mx_data_orig
grs = data[group]
p_names = list(grs.unique())
p, n = len(p_names), data.shape[0]
if k is None:
k = np.identity(p)
elif k.shape[0] != p:
raise Exception("Dimensions of K don't match number of groups.")
z = np.zeros((n, p))
for i, germ in enumerate(grs):
j = p_names.index(germ)
z[i, j] = 1.0
if type(k) is pd.DataFrame:
try:
k = k.loc[p_names, p_names].values
except KeyError:
raise KeyError("Certain groups in K differ from those " \
"provided in a dataset.")
self.mx_g = data[self.vars['observed_exogenous']].values.T
if len(self.mx_g.shape) != 2:
self.mx_g = self.mx_g[np.newaxis, :]
g = self.mx_g
self.num_m = self.vars['observed']
zkz = z @ k @ z.T
c = np.linalg.inv(np.identity(self.mx_beta.shape[0]) - self.mx_beta)
m = self.mx_lambda @ c
sigma = m @ self.mx_psi @ m.T + self.mx_theta
self.mx_m = m
m = len(obs)  # note: `m` is reused here as the number of observed variables
r = n * (np.ones((m, m)) * self.mx_v[0, 0] + sigma) \
+ np.trace(zkz) * self.mx_d
w = np.ones((n, n)) * np.trace(sigma) + zkz * np.trace(self.mx_d) + \
np.identity(n) * self.mx_v[0, 0] * m
self.mx_r_inv = chol_inv(r)
self.mx_w_inv = chol_inv(w)
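The Z matrix built in `load` is a one-hot group indicator, so `Z K Z^T` lifts the p x p group kinship to an n x n covariance across data rows. A small sketch of that construction, with toy labels and the identity kinship used by default:

```python
import numpy as np

# Toy group labels; Z maps each data row to its group.
labels = ['a', 'b', 'a', 'c']
groups = list(dict.fromkeys(labels))      # unique labels, order preserved
z = np.zeros((len(labels), len(groups)))
for i, g in enumerate(labels):
    z[i, groups.index(g)] = 1.0

k = np.eye(len(groups))                   # identity kinship, as in the default
zkz = z @ k @ z.T
# rows sharing a group get covariance 1, unrelated rows get 0
assert zkz[0, 2] == 1.0 and zkz[0, 1] == 0.0
```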
def finalize_variable_classification(self):
"""
Finalize variable classification.
Reorders variables for a nicer visual presentation and performs extra
model-specific variable respecification.
Returns
-------
None.
"""
super().finalize_variable_classification()
if self.factors:
self.vars['observed'] += sorted(self.vars['latent'])
def setup_matrices(self):
"""
Initialize base matrix structures of the model.
Returns
-------
None.
"""
super().setup_matrices()
for i, f in enumerate(self.start_rules):
name = f.__name__
if not name.endswith('_imp'):
self.start_rules[i] = getattr(startingvalues, name + '_imp')
def build_data_imp(self):
"""
Build the model-implied imputed data matrix.
Returns
-------
mx : np.ndarray
Model-implied data matrix.
names : tuple
Row and column names.
"""
mx = self.mx_data_imp[self.vars['observed']].copy()
names = (list(mx.columns), list(mx.index))
return mx.values.T, names
def build_g_imp(self):
"""
Build the model-implied imputed G matrix.
Returns
-------
mx : np.ndarray
Model-implied G matrix.
names : tuple
Row and column names.
"""
mx = self.mx_g[self.vars['observed_exogenous']].copy()
names = (list(mx.columns), list(mx.index))
return mx.values.T, names
def preprocess_effects(self, effects: dict):
"""
Run a routine just before effects are applied.
Used to apply covariances to model.
Parameters
----------
effects : dict
Mapping opcode -> lvalues -> rvalues -> multiplier.
Returns
-------
None.
"""
super().preprocess_effects(effects)
for _, lvals in effects.items():
for lval, rvals in lvals.items():
for rval in rvals:
rvals[rval] = self.symb_starting_values
missing = effects[self.symb_missing]
obs = self.vars['observed']
mx = self.mx_data_imp
for v in obs:
if v not in mx.columns:
mx[v] = np.nan
mx = mx[obs]
inds = list(mx.index)
for i, j in zip(*np.where(np.isnan(mx))):
v = obs[j]
missing[inds[i]][v] = None
obs = self.vars['observed_exogenous']
mx = self.mx_g
for v in obs:
if v not in mx.columns:
mx[v] = np.nan
mx = mx[obs]
inds = list(mx.index)
for i, j in zip(*np.where(np.isnan(mx))):
v = obs[j]
missing[inds[i]][v] = None
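`preprocess_effects` turns every NaN cell into a pending "missing" effect keyed by (row label, variable name). The NaN-locating step can be sketched on a toy frame as:

```python
import numpy as np
import pandas as pd

# Each (row, column) pair holding NaN becomes a free "missing" parameter.
df = pd.DataFrame({'x': [1.0, np.nan], 'y': [np.nan, 4.0]},
                  index=['i1', 'i2'])
obs = ['x', 'y']
inds = list(df.index)
missing = [(inds[i], obs[j]) for i, j in zip(*np.where(np.isnan(df[obs])))]
assert missing == [('i1', 'y'), ('i2', 'x')]
```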
def effect_missing(self, items: dict):
"""
Work through missing operation.
Parameters
----------
items : dict
Mapping lvalues -> rvalues -> multiplier.
Returns
-------
None.
"""
obs_exo1 = self.vars['observed_exogenous_1']
obs_exo2 = self.vars['observed_exogenous_2']
for lv, rvs in items.items():
for rv in rvs:
if rv in obs_exo1:
mx = self.mx_g1
rows, cols = self.names_g1_imp
elif rv in obs_exo2:
mx = self.mx_g2
rows, cols = self.names_g2_imp
else:
mx, (rows, cols) = self.mx_data_imp, self.names_data_imp
i = rows.index(rv)
j = cols.index(lv)
ind = (i, j)
self.n_param_missing += 1
name = f'_m{self.n_param_missing}'
self.add_param(name, matrix=mx, indices=ind, start=None,
active=True, symmetric=False,
bound=(None, None))
def fit(self, solver='SLSQP', clean_slate=False):
"""
Perform data imputation.
Parameters
----------
solver : str
Solver to use. The default is 'SLSQP'.
clean_slate : bool
If True, parameters are prepared anew before fitting. The default is
False.
Returns
-------
SolverResult
Result of optimization.
"""
if clean_slate or not hasattr(self, 'param_vals'):
self.prepare_params()
obj, grad = self.get_objective('ML')
solver = Solver(solver, obj, grad, self.param_vals)
res = solver.solve()
self.update_matrices(res.x)
return res
def prepare_params(self):
"""
Prepare structures for effective optimization routines.
Returns
-------
None.
"""
super().prepare_params()
def calc_data_grad(self):
"""
Calculate model-implied data gradient.
Returns
-------
list
List of gradient values.
"""
grad = list()
for mxs in self.mx_diffs:
g = np.float32(0.0)
if mxs[-2] is not None: # data_imp
g += mxs[-2]
if mxs[-1] is not None: # g1
g -= self.mx_mg @ mxs[-1]
grad.append(g)
return grad
def obj_ml(self, x: np.ndarray):
"""
Calculate ML objective value.
Parameters
----------
x : np.ndarray
Parameters vector.
Returns
-------
float
Value of the ML objective.
"""
self.update_matrices(x)
center = self.mx_data_imp - self.calc_mean(self.mx_m)
r, w = self.mx_r_inv, self.mx_w_inv
return np.einsum('ij,jk->', w @ center.T, r @ center)
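The contraction `'ij,jk->'` above is the grand sum of every entry of the matrix product of its two operands (equivalently 1^T (A B) 1). A quick check of that identity on random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=(4, 3))
b = rng.normal(size=(3, 5))

# 'ij,jk->' contracts the shared index j, then sums over i and k,
# i.e. it equals the sum of all entries of a @ b
total_einsum = np.einsum('ij,jk->', a, b)
total_sum = (a @ b).sum()
assert np.isclose(total_einsum, total_sum)
```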
def grad_ml(self, x: np.ndarray):
"""
Gradient of ML objective function.
Parameters
----------
x : np.ndarray
Parameters vector.
Returns
-------
np.ndarray
Gradient of ML.
"""
self.update_matrices(x)
data_grad = self.calc_data_grad()
center = self.mx_data_imp - self.calc_mean(self.mx_m)
t = self.mx_sigma_inv @ center
return 2 * np.array([np.einsum('ji,ji->', g, t)
for g in data_grad])
def get_fancy(self):
"""
Returns imputed data in DataFrame form.
Returns
-------
pd.DataFrame
DataFrame with imputed data and factor scores.
"""
data = pd.DataFrame(self.mx_data_imp, index=self.names_data_imp[0],
columns=self.names_data_imp[1])
g = pd.DataFrame(self.mx_g_imp, index=self.names_g_imp[0],
columns=self.names_g_imp[1])
if '1' in g.index:
g.drop('1', inplace=True)
res = pd.concat([data, g]).T
var = self.mod.vars
t = sorted(var['all'] - var['latent']) + sorted(var['latent'])
return res[t]
def operation_start(self, operation):
pass
def operation_bound(self, operation):
pass
def operation_constraint(self, operation):
pass
def get_imputer(self):
"""
Retrieve the appropriate Imputer class for the model type.
Parameters
----------
self : Model or ModelMeans
Model.
Returns
-------
Imputer, ImputerMeans or ImputerEffects class.
"""
if type(self) is Model:
return Imputer
elif type(self) is ModelMeans:
return ImputerMeans
elif type(self) is ModelEffects:
return ImputerEffects
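`get_imputer` dispatches on the exact class of the model, so any subclass other than the three listed falls through and returns None. A stripped-down sketch of that dispatch behaviour, using stub classes rather than the real hierarchy:

```python
class Model:
    pass

class ModelMeans(Model):
    pass

class ModelEffects(ModelMeans):
    pass

def pick_imputer(model):
    # Exact-type checks (`type(x) is C`, not isinstance): a subclass of
    # Model that is not one of the three classes gets no imputer (None).
    if type(model) is Model:
        return 'Imputer'
    elif type(model) is ModelMeans:
        return 'ImputerMeans'
    elif type(model) is ModelEffects:
        return 'ImputerEffects'

assert pick_imputer(Model()) == 'Imputer'
assert pick_imputer(ModelEffects()) == 'ImputerEffects'

class Custom(Model):
    pass

assert pick_imputer(Custom()) is None
```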
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['DeploymentBackupArgs', 'DeploymentBackup']
@pulumi.input_type
class DeploymentBackupArgs:
def __init__(__self__, *,
bucket: pulumi.Input[str],
compartment_id: pulumi.Input[str],
deployment_id: pulumi.Input[str],
display_name: pulumi.Input[str],
namespace: pulumi.Input[str],
object: pulumi.Input[str],
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None):
"""
The set of arguments for constructing a DeploymentBackup resource.
:param pulumi.Input[str] bucket: Name of the bucket where the object is to be uploaded in the object storage
:param pulumi.Input[str] compartment_id: (Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
:param pulumi.Input[str] deployment_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
:param pulumi.Input[str] display_name: An object's Display Name.
:param pulumi.Input[str] namespace: Name of namespace that serves as a container for all of your buckets
:param pulumi.Input[str] object: Name of the object to be uploaded to object storage
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
"""
pulumi.set(__self__, "bucket", bucket)
pulumi.set(__self__, "compartment_id", compartment_id)
pulumi.set(__self__, "deployment_id", deployment_id)
pulumi.set(__self__, "display_name", display_name)
pulumi.set(__self__, "namespace", namespace)
pulumi.set(__self__, "object", object)
if defined_tags is not None:
pulumi.set(__self__, "defined_tags", defined_tags)
if freeform_tags is not None:
pulumi.set(__self__, "freeform_tags", freeform_tags)
@property
@pulumi.getter
def bucket(self) -> pulumi.Input[str]:
"""
Name of the bucket where the object is to be uploaded in the object storage
"""
return pulumi.get(self, "bucket")
@bucket.setter
def bucket(self, value: pulumi.Input[str]):
pulumi.set(self, "bucket", value)
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> pulumi.Input[str]:
"""
(Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
"""
return pulumi.get(self, "compartment_id")
@compartment_id.setter
def compartment_id(self, value: pulumi.Input[str]):
pulumi.set(self, "compartment_id", value)
@property
@pulumi.getter(name="deploymentId")
def deployment_id(self) -> pulumi.Input[str]:
"""
The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
"""
return pulumi.get(self, "deployment_id")
@deployment_id.setter
def deployment_id(self, value: pulumi.Input[str]):
pulumi.set(self, "deployment_id", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> pulumi.Input[str]:
"""
An object's Display Name.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: pulumi.Input[str]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter
def namespace(self) -> pulumi.Input[str]:
"""
Name of namespace that serves as a container for all of your buckets
"""
return pulumi.get(self, "namespace")
@namespace.setter
def namespace(self, value: pulumi.Input[str]):
pulumi.set(self, "namespace", value)
@property
@pulumi.getter
def object(self) -> pulumi.Input[str]:
"""
Name of the object to be uploaded to object storage
"""
return pulumi.get(self, "object")
@object.setter
def object(self, value: pulumi.Input[str]):
pulumi.set(self, "object", value)
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
"""
return pulumi.get(self, "defined_tags")
@defined_tags.setter
def defined_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "defined_tags", value)
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
"""
return pulumi.get(self, "freeform_tags")
@freeform_tags.setter
def freeform_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "freeform_tags", value)
@pulumi.input_type
class _DeploymentBackupState:
def __init__(__self__, *,
backup_type: Optional[pulumi.Input[str]] = None,
bucket: Optional[pulumi.Input[str]] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
deployment_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
is_automatic: Optional[pulumi.Input[bool]] = None,
lifecycle_details: Optional[pulumi.Input[str]] = None,
namespace: Optional[pulumi.Input[str]] = None,
object: Optional[pulumi.Input[str]] = None,
ogg_version: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[str]] = None,
system_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
time_created: Optional[pulumi.Input[str]] = None,
time_of_backup: Optional[pulumi.Input[str]] = None,
time_updated: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering DeploymentBackup resources.
:param pulumi.Input[str] backup_type: Possible Deployment backup types.
:param pulumi.Input[str] bucket: Name of the bucket where the object is to be uploaded in the object storage
:param pulumi.Input[str] compartment_id: (Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
:param pulumi.Input[str] deployment_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
:param pulumi.Input[str] display_name: An object's Display Name.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
:param pulumi.Input[bool] is_automatic: True if this object is automatically created
:param pulumi.Input[str] lifecycle_details: Describes the object's current state in detail. For example, it can be used to provide actionable information for a resource in a Failed state.
:param pulumi.Input[str] namespace: Name of namespace that serves as a container for all of your buckets
:param pulumi.Input[str] object: Name of the object to be uploaded to object storage
:param pulumi.Input[str] ogg_version: Version of OGG
:param pulumi.Input[str] state: Possible lifecycle states.
:param pulumi.Input[Mapping[str, Any]] system_tags: The system tags associated with this resource, if any. The system tags are set by Oracle Cloud Infrastructure services. Each key is predefined and scoped to namespaces. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{orcl-cloud: {free-tier-retain: true}}`
:param pulumi.Input[str] time_created: The time the resource was created. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
:param pulumi.Input[str] time_of_backup: The time of the resource backup. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
:param pulumi.Input[str] time_updated: The time the resource was last updated. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
if backup_type is not None:
pulumi.set(__self__, "backup_type", backup_type)
if bucket is not None:
pulumi.set(__self__, "bucket", bucket)
if compartment_id is not None:
pulumi.set(__self__, "compartment_id", compartment_id)
if defined_tags is not None:
pulumi.set(__self__, "defined_tags", defined_tags)
if deployment_id is not None:
pulumi.set(__self__, "deployment_id", deployment_id)
if display_name is not None:
pulumi.set(__self__, "display_name", display_name)
if freeform_tags is not None:
pulumi.set(__self__, "freeform_tags", freeform_tags)
if is_automatic is not None:
pulumi.set(__self__, "is_automatic", is_automatic)
if lifecycle_details is not None:
pulumi.set(__self__, "lifecycle_details", lifecycle_details)
if namespace is not None:
pulumi.set(__self__, "namespace", namespace)
if object is not None:
pulumi.set(__self__, "object", object)
if ogg_version is not None:
pulumi.set(__self__, "ogg_version", ogg_version)
if state is not None:
pulumi.set(__self__, "state", state)
if system_tags is not None:
pulumi.set(__self__, "system_tags", system_tags)
if time_created is not None:
pulumi.set(__self__, "time_created", time_created)
if time_of_backup is not None:
pulumi.set(__self__, "time_of_backup", time_of_backup)
if time_updated is not None:
pulumi.set(__self__, "time_updated", time_updated)
@property
@pulumi.getter(name="backupType")
def backup_type(self) -> Optional[pulumi.Input[str]]:
"""
Possible Deployment backup types.
"""
return pulumi.get(self, "backup_type")
@backup_type.setter
def backup_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backup_type", value)
@property
@pulumi.getter
def bucket(self) -> Optional[pulumi.Input[str]]:
"""
Name of the bucket where the object is to be uploaded in the object storage
"""
return pulumi.get(self, "bucket")
@bucket.setter
def bucket(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "bucket", value)
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> Optional[pulumi.Input[str]]:
"""
(Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
"""
return pulumi.get(self, "compartment_id")
@compartment_id.setter
def compartment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "compartment_id", value)
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
"""
return pulumi.get(self, "defined_tags")
@defined_tags.setter
def defined_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "defined_tags", value)
@property
@pulumi.getter(name="deploymentId")
def deployment_id(self) -> Optional[pulumi.Input[str]]:
"""
The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
"""
return pulumi.get(self, "deployment_id")
@deployment_id.setter
def deployment_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "deployment_id", value)
@property
@pulumi.getter(name="displayName")
def display_name(self) -> Optional[pulumi.Input[str]]:
"""
An object's Display Name.
"""
return pulumi.get(self, "display_name")
@display_name.setter
def display_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "display_name", value)
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
(Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
"""
return pulumi.get(self, "freeform_tags")
@freeform_tags.setter
def freeform_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "freeform_tags", value)
@property
@pulumi.getter(name="isAutomatic")
def is_automatic(self) -> Optional[pulumi.Input[bool]]:
"""
True if this object is automatically created
"""
return pulumi.get(self, "is_automatic")
@is_automatic.setter
def is_automatic(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_automatic", value)
@property
@pulumi.getter(name="lifecycleDetails")
def lifecycle_details(self) -> Optional[pulumi.Input[str]]:
"""
Describes the object's current state in detail. For example, it can be used to provide actionable information for a resource in a Failed state.
"""
return pulumi.get(self, "lifecycle_details")
@lifecycle_details.setter
def lifecycle_details(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "lifecycle_details", value)
@property
@pulumi.getter
def namespace(self) -> Optional[pulumi.Input[str]]:
"""
Name of namespace that serves as a container for all of your buckets
"""
return pulumi.get(self, "namespace")
@namespace.setter
def namespace(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "namespace", value)
@property
@pulumi.getter
def object(self) -> Optional[pulumi.Input[str]]:
"""
Name of the object to be uploaded to object storage
"""
return pulumi.get(self, "object")
@object.setter
def object(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "object", value)
@property
@pulumi.getter(name="oggVersion")
def ogg_version(self) -> Optional[pulumi.Input[str]]:
"""
Version of OGG
"""
return pulumi.get(self, "ogg_version")
@ogg_version.setter
def ogg_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "ogg_version", value)
@property
@pulumi.getter
def state(self) -> Optional[pulumi.Input[str]]:
"""
Possible lifecycle states.
"""
return pulumi.get(self, "state")
@state.setter
def state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "state", value)
@property
@pulumi.getter(name="systemTags")
def system_tags(self) -> Optional[pulumi.Input[Mapping[str, Any]]]:
"""
The system tags associated with this resource, if any. The system tags are set by Oracle Cloud Infrastructure services. Each key is predefined and scoped to namespaces. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{orcl-cloud: {free-tier-retain: true}}`
"""
return pulumi.get(self, "system_tags")
@system_tags.setter
def system_tags(self, value: Optional[pulumi.Input[Mapping[str, Any]]]):
pulumi.set(self, "system_tags", value)
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> Optional[pulumi.Input[str]]:
"""
The time the resource was created. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_created")
@time_created.setter
def time_created(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_created", value)
@property
@pulumi.getter(name="timeOfBackup")
def time_of_backup(self) -> Optional[pulumi.Input[str]]:
"""
The time of the resource backup. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_of_backup")
@time_of_backup.setter
def time_of_backup(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_of_backup", value)
@property
@pulumi.getter(name="timeUpdated")
def time_updated(self) -> Optional[pulumi.Input[str]]:
"""
The time the resource was last updated. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_updated")
@time_updated.setter
def time_updated(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "time_updated", value)
class DeploymentBackup(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
bucket: Optional[pulumi.Input[str]] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
deployment_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
namespace: Optional[pulumi.Input[str]] = None,
object: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
This resource provides the Deployment Backup resource in Oracle Cloud Infrastructure Golden Gate service.
Creates a new DeploymentBackup.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_deployment_backup = oci.goldengate.DeploymentBackup("testDeploymentBackup",
bucket=var["deployment_backup_bucket"],
compartment_id=var["compartment_id"],
deployment_id=oci_golden_gate_deployment["test_deployment"]["id"],
display_name=var["deployment_backup_display_name"],
namespace=var["deployment_backup_namespace"],
object=var["deployment_backup_object"],
defined_tags={
"foo-namespace.bar-key": "value",
},
freeform_tags={
"bar-key": "value",
})
```
## Import
DeploymentBackups can be imported using the `id`, e.g.
```sh
$ pulumi import oci:goldengate/deploymentBackup:DeploymentBackup test_deployment_backup "id"
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] bucket: Name of the bucket where the object is to be uploaded in the object storage
:param pulumi.Input[str] compartment_id: (Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
:param pulumi.Input[str] deployment_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
:param pulumi.Input[str] display_name: An object's Display Name.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
:param pulumi.Input[str] namespace: Name of namespace that serves as a container for all of your buckets
:param pulumi.Input[str] object: Name of the object to be uploaded to object storage
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: DeploymentBackupArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
This resource provides the Deployment Backup resource in Oracle Cloud Infrastructure Golden Gate service.
Creates a new DeploymentBackup.
## Example Usage
```python
import pulumi
import pulumi_oci as oci
test_deployment_backup = oci.goldengate.DeploymentBackup("testDeploymentBackup",
bucket=var["deployment_backup_bucket"],
compartment_id=var["compartment_id"],
deployment_id=oci_golden_gate_deployment["test_deployment"]["id"],
display_name=var["deployment_backup_display_name"],
namespace=var["deployment_backup_namespace"],
object=var["deployment_backup_object"],
defined_tags={
"foo-namespace.bar-key": "value",
},
freeform_tags={
"bar-key": "value",
})
```
## Import
DeploymentBackups can be imported using the `id`, e.g.
```sh
$ pulumi import oci:goldengate/deploymentBackup:DeploymentBackup test_deployment_backup "id"
```
:param str resource_name: The name of the resource.
:param DeploymentBackupArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(DeploymentBackupArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
bucket: Optional[pulumi.Input[str]] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
deployment_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
namespace: Optional[pulumi.Input[str]] = None,
object: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = DeploymentBackupArgs.__new__(DeploymentBackupArgs)
if bucket is None and not opts.urn:
raise TypeError("Missing required property 'bucket'")
__props__.__dict__["bucket"] = bucket
if compartment_id is None and not opts.urn:
raise TypeError("Missing required property 'compartment_id'")
__props__.__dict__["compartment_id"] = compartment_id
__props__.__dict__["defined_tags"] = defined_tags
if deployment_id is None and not opts.urn:
raise TypeError("Missing required property 'deployment_id'")
__props__.__dict__["deployment_id"] = deployment_id
if display_name is None and not opts.urn:
raise TypeError("Missing required property 'display_name'")
__props__.__dict__["display_name"] = display_name
__props__.__dict__["freeform_tags"] = freeform_tags
if namespace is None and not opts.urn:
raise TypeError("Missing required property 'namespace'")
__props__.__dict__["namespace"] = namespace
if object is None and not opts.urn:
raise TypeError("Missing required property 'object'")
__props__.__dict__["object"] = object
__props__.__dict__["backup_type"] = None
__props__.__dict__["is_automatic"] = None
__props__.__dict__["lifecycle_details"] = None
__props__.__dict__["ogg_version"] = None
__props__.__dict__["state"] = None
__props__.__dict__["system_tags"] = None
__props__.__dict__["time_created"] = None
__props__.__dict__["time_of_backup"] = None
__props__.__dict__["time_updated"] = None
super(DeploymentBackup, __self__).__init__(
'oci:goldengate/deploymentBackup:DeploymentBackup',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
backup_type: Optional[pulumi.Input[str]] = None,
bucket: Optional[pulumi.Input[str]] = None,
compartment_id: Optional[pulumi.Input[str]] = None,
defined_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
deployment_id: Optional[pulumi.Input[str]] = None,
display_name: Optional[pulumi.Input[str]] = None,
freeform_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
is_automatic: Optional[pulumi.Input[bool]] = None,
lifecycle_details: Optional[pulumi.Input[str]] = None,
namespace: Optional[pulumi.Input[str]] = None,
object: Optional[pulumi.Input[str]] = None,
ogg_version: Optional[pulumi.Input[str]] = None,
state: Optional[pulumi.Input[str]] = None,
system_tags: Optional[pulumi.Input[Mapping[str, Any]]] = None,
time_created: Optional[pulumi.Input[str]] = None,
time_of_backup: Optional[pulumi.Input[str]] = None,
time_updated: Optional[pulumi.Input[str]] = None) -> 'DeploymentBackup':
"""
Get an existing DeploymentBackup resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] backup_type: Possible Deployment backup types.
:param pulumi.Input[str] bucket: Name of the bucket where the object is to be uploaded in the object storage
:param pulumi.Input[str] compartment_id: (Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
:param pulumi.Input[Mapping[str, Any]] defined_tags: (Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
:param pulumi.Input[str] deployment_id: The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
:param pulumi.Input[str] display_name: An object's Display Name.
:param pulumi.Input[Mapping[str, Any]] freeform_tags: (Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
:param pulumi.Input[bool] is_automatic: True if this object is automatically created
:param pulumi.Input[str] lifecycle_details: Describes the object's current state in detail. For example, it can be used to provide actionable information for a resource in a Failed state.
:param pulumi.Input[str] namespace: Name of namespace that serves as a container for all of your buckets
:param pulumi.Input[str] object: Name of the object to be uploaded to object storage
:param pulumi.Input[str] ogg_version: Version of OGG
:param pulumi.Input[str] state: Possible lifecycle states.
:param pulumi.Input[Mapping[str, Any]] system_tags: The system tags associated with this resource, if any. The system tags are set by Oracle Cloud Infrastructure services. Each key is predefined and scoped to namespaces. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{orcl-cloud: {free-tier-retain: true}}`
:param pulumi.Input[str] time_created: The time the resource was created. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
:param pulumi.Input[str] time_of_backup: The time of the resource backup. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
:param pulumi.Input[str] time_updated: The time the resource was last updated. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _DeploymentBackupState.__new__(_DeploymentBackupState)
__props__.__dict__["backup_type"] = backup_type
__props__.__dict__["bucket"] = bucket
__props__.__dict__["compartment_id"] = compartment_id
__props__.__dict__["defined_tags"] = defined_tags
__props__.__dict__["deployment_id"] = deployment_id
__props__.__dict__["display_name"] = display_name
__props__.__dict__["freeform_tags"] = freeform_tags
__props__.__dict__["is_automatic"] = is_automatic
__props__.__dict__["lifecycle_details"] = lifecycle_details
__props__.__dict__["namespace"] = namespace
__props__.__dict__["object"] = object
__props__.__dict__["ogg_version"] = ogg_version
__props__.__dict__["state"] = state
__props__.__dict__["system_tags"] = system_tags
__props__.__dict__["time_created"] = time_created
__props__.__dict__["time_of_backup"] = time_of_backup
__props__.__dict__["time_updated"] = time_updated
return DeploymentBackup(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="backupType")
def backup_type(self) -> pulumi.Output[str]:
"""
Possible Deployment backup types.
"""
return pulumi.get(self, "backup_type")
@property
@pulumi.getter
def bucket(self) -> pulumi.Output[str]:
"""
Name of the bucket where the object is to be uploaded in the object storage
"""
return pulumi.get(self, "bucket")
@property
@pulumi.getter(name="compartmentId")
def compartment_id(self) -> pulumi.Output[str]:
"""
(Updatable) The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the compartment being referenced.
"""
return pulumi.get(self, "compartment_id")
@property
@pulumi.getter(name="definedTags")
def defined_tags(self) -> pulumi.Output[Mapping[str, Any]]:
"""
(Updatable) Tags defined for this resource. Each key is predefined and scoped to a namespace. Example: `{"foo-namespace.bar-key": "value"}`
"""
return pulumi.get(self, "defined_tags")
@property
@pulumi.getter(name="deploymentId")
def deployment_id(self) -> pulumi.Output[str]:
"""
The [OCID](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/identifiers.htm) of the deployment being referenced.
"""
return pulumi.get(self, "deployment_id")
@property
@pulumi.getter(name="displayName")
def display_name(self) -> pulumi.Output[str]:
"""
An object's Display Name.
"""
return pulumi.get(self, "display_name")
@property
@pulumi.getter(name="freeformTags")
def freeform_tags(self) -> pulumi.Output[Mapping[str, Any]]:
"""
(Updatable) A simple key-value pair that is applied without any predefined name, type, or scope. Exists for cross-compatibility only. Example: `{"bar-key": "value"}`
"""
return pulumi.get(self, "freeform_tags")
@property
@pulumi.getter(name="isAutomatic")
def is_automatic(self) -> pulumi.Output[bool]:
"""
True if this object is automatically created
"""
return pulumi.get(self, "is_automatic")
@property
@pulumi.getter(name="lifecycleDetails")
def lifecycle_details(self) -> pulumi.Output[str]:
"""
Describes the object's current state in detail. For example, it can be used to provide actionable information for a resource in a Failed state.
"""
return pulumi.get(self, "lifecycle_details")
@property
@pulumi.getter
def namespace(self) -> pulumi.Output[str]:
"""
Name of namespace that serves as a container for all of your buckets
"""
return pulumi.get(self, "namespace")
@property
@pulumi.getter
def object(self) -> pulumi.Output[str]:
"""
Name of the object to be uploaded to object storage
"""
return pulumi.get(self, "object")
@property
@pulumi.getter(name="oggVersion")
def ogg_version(self) -> pulumi.Output[str]:
"""
Version of OGG
"""
return pulumi.get(self, "ogg_version")
@property
@pulumi.getter
def state(self) -> pulumi.Output[str]:
"""
Possible lifecycle states.
"""
return pulumi.get(self, "state")
@property
@pulumi.getter(name="systemTags")
def system_tags(self) -> pulumi.Output[Mapping[str, Any]]:
"""
The system tags associated with this resource, if any. The system tags are set by Oracle Cloud Infrastructure services. Each key is predefined and scoped to namespaces. For more information, see [Resource Tags](https://docs.cloud.oracle.com/iaas/Content/General/Concepts/resourcetags.htm). Example: `{orcl-cloud: {free-tier-retain: true}}`
"""
return pulumi.get(self, "system_tags")
@property
@pulumi.getter(name="timeCreated")
def time_created(self) -> pulumi.Output[str]:
"""
The time the resource was created. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_created")
@property
@pulumi.getter(name="timeOfBackup")
def time_of_backup(self) -> pulumi.Output[str]:
"""
The time of the resource backup. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_of_backup")
@property
@pulumi.getter(name="timeUpdated")
def time_updated(self) -> pulumi.Output[str]:
"""
The time the resource was last updated. The format is defined by [RFC3339](https://tools.ietf.org/html/rfc3339), such as `2016-08-25T21:10:29.600Z`.
"""
return pulumi.get(self, "time_updated")
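The constructor above repeats one guard per required property: raise `TypeError` unless the value is set or `opts.urn` identifies an already-provisioned resource. A dependency-free sketch of that pattern (`FakeOpts` and `check_required` are illustrative names, not part of the Pulumi SDK):

```python
class FakeOpts:
    """Stand-in for pulumi.ResourceOptions; only the `urn` attribute matters here."""
    def __init__(self, urn=None):
        self.urn = urn


def check_required(props, required, opts):
    """Raise TypeError for any missing required property, unless opts.urn
    points at an existing resource (the lookup case used above)."""
    for name in required:
        if props.get(name) is None and not opts.urn:
            raise TypeError("Missing required property '%s'" % name)
    return props


# With no urn, every listed property must be provided:
props = check_required({"bucket": "backups", "namespace": "ns1"},
                       ["bucket", "namespace"], FakeOpts())
```

In the generated SDK the same check is inlined once per field; factoring it out like this is only a readability sketch.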
# =========================================================================
# File: scripts/pr_reminder_basedon_assignee.py
# Repo: shichun-0415/docs (license: Apache-2.0)
# =========================================================================
from os import close
import sys
import requests
from lxml import etree
from datetime import datetime
from string import Template
docs_cn_url = 'https://github.com/pingcap/docs-cn/pulls?q=is%3Apr'
docs_url = 'https://github.com/pingcap/docs/pulls?q=is%3Apr'
open_url = '+is%3Aopen+is%3Apr'
close_url = '+is%3Aclosed+label%3Atranslation%2Fdoing'
v54 = '+label%3Av5.4'
type_compatibility_change = '+label%3Atype%2Fcompatibility-or-feature-change'
type_oncall = '+label%3AONCALL'
type_bugfix = '+label%3Atype%2Fbug-fix'
type_enhancement = '+label%3Atype%2Fenhancement'
shichun_0415_assignee = '+assignee%3Ashichun-0415'
shichun_0415_author = '+author%3Ashichun-0415'
en_jin19_assignee = '+assignee%3Aen-jin19'
en_jin19_author = '+author%3Aen-jin19'
hfxsd_assignee = '+assignee%3Ahfxsd'
hfxsd_author = '+author%3Ahfxsd'
ran_huang_assignee = '+assignee%3Aran-huang'
ran_huang_author = '+author%3Aran-huang'
qiancai_assignee = '+assignee%3Aqiancai'
qiancai_author = '+author%3Aqiancai'
tomshawn_assignee = '+assignee%3ATomShawn'
tomshawn_author = '+author%3ATomShawn'
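The query fragments above hand-encode `:` as `%3A` and `/` as `%2F`. The same fragments can be generated with `urllib.parse.quote_plus`, which avoids typos when new filters are added; a minimal sketch (`gh_filter` is an illustrative helper, not part of this script):

```python
from urllib.parse import quote_plus


def gh_filter(key, value):
    """Encode one GitHub search qualifier, e.g. assignee:qiancai."""
    return '+' + quote_plus('%s:%s' % (key, value))


assert gh_filter('assignee', 'qiancai') == '+assignee%3Aqiancai'
# Labels containing '/' encode the slash as %2F, matching close_url above:
assert gh_filter('label', 'translation/doing') == '+label%3Atranslation%2Fdoing'
```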
def get_pr_no(url):
    page_text = requests.get(url=url).text
    tree = etree.HTML(page_text)
    pr_no = tree.xpath('//div[@class="table-list-header-toggle states flex-auto pl-0"]/a[@class="btn-link selected"]/text()')[1].strip()
    if pr_no:
        if pr_no.endswith('d'):   # e.g. "3 Closed" -> strip the " Closed" suffix
            return str(pr_no[:-7])
        if pr_no.endswith('n'):   # e.g. "5 Open" -> strip the " Open" suffix
            return str(pr_no[:-5])
        # print("Could not parse the PR count")
        return 0  # unrecognized label text; previously this fell through and returned None
    else:
        return 0
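The suffix-stripping above ties the code to the exact English label text ("N Open" / "N Closed"). A more robust way to pull the count out of whatever the page reports is a regex on the leading digits; a sketch (the live scraping still depends on GitHub's markup staying stable, and `parse_pr_count` is an illustrative name):

```python
import re


def parse_pr_count(label):
    """Extract the leading integer from text like '12 Open' or '1,024 Closed'.
    Returns 0 when no count can be found."""
    if not label:
        return 0
    m = re.match(r'\s*([\d,]+)', label)
    return int(m.group(1).replace(',', '')) if m else 0


assert parse_pr_count('12 Open') == 12
assert parse_pr_count('1,024 Closed') == 1024
assert parse_pr_count('') == 0
```

Returning an `int` here (rather than the `str` the function above returns) also lets callers sum or compare counts directly.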
# NOTE: this plain-text TEMPLATE is not referenced anywhere below; the message
# actually sent is built from the string.Template defined under __main__.
TEMPLATE = '''
*************************************************
待处理的 PR 数目报告
查询时间:{date}
待处理 PR 数目如下
*************************************************
v5.4 发版文档,中文文档截止日期 2021-01-07,英文文档截止日期 2021-01-18
- en-jin19
- docs-cn:有 {en-jin19-zh-assignee-open} 个未合源语 PR 待处理,有 {en-jin19-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {en-jin19-zh-author-open} 个 PR 未合并
- docs:有 {en-jin19-en-assignee-open} 个未合源语 PR 待处理,有 {en-jin19-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {en-jin19-en-author-open} 个 PR 未合并
- shichun-0415
- docs-cn:有 {shichun-0415-zh-assignee-open} 个未合源语 PR 待处理,有 {shichun-0415-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {shichun-0415-zh-author-open} 个 PR 未合并
- docs:有 {shichun-0415-en-assignee-open} 个未合源语 PR 待处理,有 {shichun-0415-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {shichun-0415-en-author-open} 个 PR 未合并
- hfxsd
- docs-cn:有 {hfxsd-zh-assignee-open} 个未合源语 PR 待处理,有 {hfxsd-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {hfxsd-zh-author-open} 个 PR 未合并
- docs:有 {hfxsd-en-assignee-open} 个未合源语 PR 待处理,有 {hfxsd-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {hfxsd-en-author-open} 个 PR 未合并
- ran-huang
- docs-cn:有 {ran-huang-zh-assignee-open} 个未合源语 PR 待处理,有 {ran-huang-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {ran-huang-zh-author-open} 个 PR 未合并
- docs:有 {ran-huang-en-assignee-open} 个未合源语 PR 待处理,有 {ran-huang-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {ran-huang-en-author-open} 个 PR 未合并
- qiancai
- docs-cn:有 {qiancai-zh-assignee-open} 个未合源语 PR 待处理,有 {qiancai-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {qiancai-zh-author-open} 个 PR 未合并
- docs:有 {qiancai-en-assignee-open} 个未合源语 PR 待处理,有 {qiancai-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {qiancai-en-author-open} 个 PR 未合并
- TomShawn
- docs-cn:有 {tomshawn-zh-assignee-open} 个未合源语 PR 待处理,有 {tomshawn-zh-assignee-close} 个已合源语 PR 待翻译,已翻译了 {tomshawn-zh-author-open} 个 PR 未合并
- docs:有 {tomshawn-en-assignee-open} 个未合源语 PR 待处理,有 {tomshawn-en-assignee-close} 个已合源语 PR 待翻译,已翻译了 {tomshawn-en-author-open} 个 PR 未合并
*************************************************
'''
if __name__ == "__main__":
data = {
'date': datetime.utcnow().strftime('%Y-%m-%d'),
'shichun_0415_zh_assignee_open': get_pr_no(docs_cn_url + open_url + shichun_0415_assignee + v54),
'shichun_0415_zh_assignee_close': get_pr_no(docs_cn_url + close_url + shichun_0415_assignee + v54),
'shichun_0415_zh_author_open': get_pr_no(docs_cn_url + open_url + shichun_0415_author + v54),
'shichun_0415_en_assignee_open': get_pr_no(docs_url + open_url + shichun_0415_assignee + v54),
'shichun_0415_en_assignee_close': get_pr_no(docs_url + close_url + shichun_0415_assignee + v54),
'shichun_0415_en_author_open': get_pr_no(docs_url + open_url + shichun_0415_author + v54),
'en_jin19_zh_assignee_open': get_pr_no(docs_cn_url + open_url + en_jin19_assignee + v54),
'en_jin19_zh_assignee_close': get_pr_no(docs_cn_url + close_url + en_jin19_assignee + v54),
'en_jin19_zh_author_open': get_pr_no(docs_cn_url + open_url + en_jin19_author + v54),
'en_jin19_en_assignee_open': get_pr_no(docs_url + open_url + en_jin19_assignee + v54),
'en_jin19_en_assignee_close': get_pr_no(docs_url + close_url + en_jin19_assignee + v54),
'en_jin19_en_author_open': get_pr_no(docs_url + open_url + en_jin19_author + v54),
'hfxsd_zh_assignee_open': get_pr_no(docs_cn_url + open_url + hfxsd_assignee + v54),
'hfxsd_zh_assignee_close': get_pr_no(docs_cn_url + close_url + hfxsd_assignee + v54),
'hfxsd_zh_author_open': get_pr_no(docs_cn_url + open_url + hfxsd_author + v54),
'hfxsd_en_assignee_open': get_pr_no(docs_url + open_url + hfxsd_assignee + v54),
'hfxsd_en_assignee_close': get_pr_no(docs_url + close_url + hfxsd_assignee + v54),
'hfxsd_en_author_open': get_pr_no(docs_url + open_url + hfxsd_author + v54),
'ran_huang_zh_assignee_open': get_pr_no(docs_cn_url + open_url + ran_huang_assignee + v54),
'ran_huang_zh_assignee_close': get_pr_no(docs_cn_url + close_url + ran_huang_assignee + v54),
'ran_huang_zh_author_open': get_pr_no(docs_cn_url + open_url + ran_huang_author + v54),
'ran_huang_en_assignee_open': get_pr_no(docs_url + open_url + ran_huang_assignee + v54),
'ran_huang_en_assignee_close': get_pr_no(docs_url + close_url + ran_huang_assignee + v54),
'ran_huang_en_author_open': get_pr_no(docs_url + open_url + ran_huang_author + v54),
'qiancai_zh_assignee_open': get_pr_no(docs_cn_url + open_url + qiancai_assignee + v54),
'qiancai_zh_assignee_close': get_pr_no(docs_cn_url + close_url + qiancai_assignee + v54),
'qiancai_zh_author_open': get_pr_no(docs_cn_url + open_url + qiancai_author + v54),
'qiancai_en_assignee_open': get_pr_no(docs_url + open_url + qiancai_assignee + v54),
'qiancai_en_assignee_close': get_pr_no(docs_url + close_url + qiancai_assignee + v54),
'qiancai_en_author_open': get_pr_no(docs_url + open_url + qiancai_author + v54),
'tomshawn_zh_assignee_open': get_pr_no(docs_cn_url + open_url + tomshawn_assignee + v54),
'tomshawn_zh_assignee_close': get_pr_no(docs_cn_url + close_url + tomshawn_assignee + v54),
'tomshawn_zh_author_open': get_pr_no(docs_cn_url + open_url + tomshawn_author + v54),
'tomshawn_en_assignee_open': get_pr_no(docs_url + open_url + tomshawn_assignee + v54),
'tomshawn_en_assignee_close': get_pr_no(docs_url + close_url + tomshawn_assignee + v54),
'tomshawn_en_author_open': get_pr_no(docs_url + open_url + tomshawn_author + v54),
'shichun_0415_zh_assignee_open_url': docs_cn_url + open_url + shichun_0415_assignee + v54,
'shichun_0415_zh_assignee_close_url': docs_cn_url + close_url + shichun_0415_assignee + v54,
'shichun_0415_zh_author_open_url': docs_cn_url + open_url + shichun_0415_author + v54,
'shichun_0415_en_assignee_open_url': docs_url + open_url + shichun_0415_assignee + v54,
'shichun_0415_en_assignee_close_url': docs_url + close_url + shichun_0415_assignee + v54,
'shichun_0415_en_author_open_url': docs_url + open_url + shichun_0415_author + v54,
'en_jin19_zh_assignee_open_url': docs_cn_url + open_url + en_jin19_assignee + v54,
'en_jin19_zh_assignee_close_url': docs_cn_url + close_url + en_jin19_assignee + v54,
'en_jin19_zh_author_open_url': docs_cn_url + open_url + en_jin19_author + v54,
'en_jin19_en_assignee_open_url': docs_url + open_url + en_jin19_assignee + v54,
'en_jin19_en_assignee_close_url': docs_url + close_url + en_jin19_assignee + v54,
'en_jin19_en_author_open_url': docs_url + open_url + en_jin19_author + v54,
'hfxsd_zh_assignee_open_url': docs_cn_url + open_url + hfxsd_assignee + v54,
'hfxsd_zh_assignee_close_url': docs_cn_url + close_url + hfxsd_assignee + v54,
'hfxsd_zh_author_open_url': docs_cn_url + open_url + hfxsd_author + v54,
'hfxsd_en_assignee_open_url': docs_url + open_url + hfxsd_assignee + v54,
'hfxsd_en_assignee_close_url': docs_url + close_url + hfxsd_assignee + v54,
'hfxsd_en_author_open_url': docs_url + open_url + hfxsd_author + v54,
'ran_huang_zh_assignee_open_url': docs_cn_url + open_url + ran_huang_assignee + v54,
'ran_huang_zh_assignee_close_url': docs_cn_url + close_url + ran_huang_assignee + v54,
'ran_huang_zh_author_open_url': docs_cn_url + open_url + ran_huang_author + v54,
'ran_huang_en_assignee_open_url': docs_url + open_url + ran_huang_assignee + v54,
'ran_huang_en_assignee_close_url': docs_url + close_url + ran_huang_assignee + v54,
'ran_huang_en_author_open_url': docs_url + open_url + ran_huang_author + v54,
'qiancai_zh_assignee_open_url': docs_cn_url + open_url + qiancai_assignee + v54,
'qiancai_zh_assignee_close_url': docs_cn_url + close_url + qiancai_assignee + v54,
'qiancai_zh_author_open_url': docs_cn_url + open_url + qiancai_author + v54,
'qiancai_en_assignee_open_url': docs_url + open_url + qiancai_assignee + v54,
'qiancai_en_assignee_close_url': docs_url + close_url + qiancai_assignee + v54,
'qiancai_en_author_open_url': docs_url + open_url + qiancai_author + v54,
'tomshawn_zh_assignee_open_url': docs_cn_url + open_url + tomshawn_assignee + v54,
'tomshawn_zh_assignee_close_url': docs_cn_url + close_url + tomshawn_assignee + v54,
'tomshawn_zh_author_open_url': docs_cn_url + open_url + tomshawn_author + v54,
'tomshawn_en_assignee_open_url': docs_url + open_url + tomshawn_assignee + v54,
'tomshawn_en_assignee_close_url': docs_url + close_url + tomshawn_assignee + v54,
'tomshawn_en_author_open_url': docs_url + open_url + tomshawn_author + v54,
}
URL = sys.argv[1]
d = Template("""{
"msg_type": "post",
"content": {
"post": {
"zh-CN": {
"title": "待处理的 PR 数目报告",
"content": [
[
{
"tag": "text",
"text": "查询时间:${date}"
}
],
[
{
"tag": "text",
"text": "待处理 PR 数目如下"
}
],
[
{
"tag": "text",
"text": "*************************************************"
}
],
[
{
"tag": "text",
"text": "v5.4 发版文档,中文文档截止日期 2021-01-07,英文文档截止日期 2021-01-18"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- en-jin19"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${en_jin19_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${en_jin19_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${en_jin19_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${en_jin19_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${en_jin19_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${en_jin19_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${en_jin19_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${en_jin19_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${en_jin19_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${en_jin19_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${en_jin19_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${en_jin19_en_author_open_url}"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- shichun-0415"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${shichun_0415_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${shichun_0415_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${shichun_0415_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${shichun_0415_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${shichun_0415_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${shichun_0415_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${shichun_0415_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${shichun_0415_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${shichun_0415_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${shichun_0415_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${shichun_0415_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${shichun_0415_en_author_open_url}"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- hfxsd"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${hfxsd_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${hfxsd_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${hfxsd_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${hfxsd_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${hfxsd_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${hfxsd_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${hfxsd_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${hfxsd_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${hfxsd_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${hfxsd_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${hfxsd_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${hfxsd_en_author_open_url}"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- ran-huang"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${ran_huang_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${ran_huang_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${ran_huang_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${ran_huang_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${ran_huang_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${ran_huang_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${ran_huang_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${ran_huang_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${ran_huang_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${ran_huang_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${ran_huang_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${ran_huang_en_author_open_url}"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- qiancai"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${qiancai_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${qiancai_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${qiancai_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${qiancai_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${qiancai_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${qiancai_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${qiancai_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${qiancai_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${qiancai_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${qiancai_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${qiancai_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${qiancai_en_author_open_url}"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": "- TomShawn"
}
],
[
{
"tag": "text",
"text": ""
}
],
[
{
"tag": "text",
"text": " - docs-cn:有 ${tomshawn_zh_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${tomshawn_zh_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${tomshawn_zh_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${tomshawn_zh_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${tomshawn_zh_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${tomshawn_zh_author_open_url}"
}
],
[
{
"tag": "text",
"text": " - docs:有 ${tomshawn_en_assignee_open} 个未合源语 PR"
},
{
"tag": "a",
"text": "待处理",
"href": "${tomshawn_en_assignee_open_url}"
},
{
"tag": "text",
"text": ",有 ${tomshawn_en_assignee_close} 个已合源语 PR"
},
{
"tag": "a",
"text": "待翻译",
"href": "${tomshawn_en_assignee_close_url}"
},
{
"tag": "text",
"text": ",已翻译了 ${tomshawn_en_author_open} 个 PR"
},
{
"tag": "a",
"text": "未合并",
"href": "${tomshawn_en_author_open_url}"
}
]
]
}
}
}
}""")
headers = {
'Content-Type': 'application/json'
}
r_data=d.substitute(data).encode('utf-8')
r_docs = requests.request("POST", URL, headers=headers, data=r_data)
print(f'{r_docs.status_code} {r_docs.reason}')
# print(r_docs.text)
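The several-hundred-line `Template` above hand-writes one near-identical card section per person. The same Feishu "post" payload can be generated from plain Python data structures and `json.dumps`, which removes the repetition and the risk of a mismatched placeholder. A sketch with made-up counts (field names follow the Feishu post message format used above; `person_section` is an illustrative helper):

```python
import json


def person_section(repo, open_n, close_n, merged_n, base_url):
    """One '- docs-cn: ...' style line of the Feishu post payload."""
    return [
        {"tag": "text", "text": "  - %s:有 %d 个未合源语 PR" % (repo, open_n)},
        {"tag": "a", "text": "待处理", "href": base_url},
        {"tag": "text",
         "text": ",有 %d 个已合源语 PR 待翻译,已翻译了 %d 个 PR 未合并" % (close_n, merged_n)},
    ]


content = [
    [{"tag": "text", "text": "- qiancai"}],
    person_section("docs-cn", 2, 1, 3, "https://github.com/pingcap/docs-cn/pulls"),
]
payload = {"msg_type": "post",
           "content": {"post": {"zh-CN": {"title": "待处理的 PR 数目报告",
                                          "content": content}}}}
body = json.dumps(payload, ensure_ascii=False)
```

`json.dumps` also handles quoting and escaping, so the substituted values cannot accidentally break the JSON the way a raw string template can.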
# =========================================================================
# File: Yukki/Plugins/broadcast.py
# Repo: vaibhav252/YukkiMusic-Old (license: MIT)
# =========================================================================
import asyncio
from pyrogram.types import Message
from pyrogram import filters, Client
from Yukki import app, OWNER
from ..YukkiUtilities.helpers.filters import command
from Yukki.YukkiUtilities.database.chats import get_served_chats
@app.on_message(command(["broadcast_pin"]) & filters.user(OWNER))
async def broadcast_message_pin(_, message):
if not message.reply_to_message:
pass
else:
msg = await message.reply_text("🔄 Broadcasting message...")
x = message.reply_to_message.message_id
y = message.chat.id
sent = 0
pins = 0
chats = []
schats = await get_served_chats()
for chat in schats:
chats.append(int(chat["chat_id"]))
for i in chats:
try:
m = await app.forward_messages(i, y, x)
try:
await m.pin(disable_notification=False)
pins += 1
except Exception:
pass
await asyncio.sleep(.3)
sent += 1
except Exception:
pass
await msg.edit_text(f"✅ Broadcasted message in {sent} chats\n📌 Sent with {pins} chat pins.")
return
if len(message.command) < 2:
await message.reply_text("**usage**:\n\n/broadcast_pin (message)")
return
msg = await message.reply_text("🔄 Broadcasting message...")
text = message.text.split(None, 1)[1]
sent = 0
pins = 0
chats = []
schats = await get_served_chats()
for chat in schats:
chats.append(int(chat["chat_id"]))
for i in chats:
try:
m = await app.send_message(i, text=text)
try:
await m.pin(disable_notification=False)
pins += 1
except Exception:
pass
await asyncio.sleep(.3)
sent += 1
except Exception:
pass
await msg.edit_text(f"✅ Broadcasted message in {sent} chats\n📌 Sent with {pins} chat pins.")
@app.on_message(command(["broadcast"]) & filters.user(OWNER))
async def broadcast_message_nopin(_, message):
if not message.reply_to_message:
pass
else:
msg = await message.reply_text("🔄 Broadcasting message...")
x = message.reply_to_message.message_id
y = message.chat.id
sent = 0
chats = []
schats = await get_served_chats()
for chat in schats:
chats.append(int(chat["chat_id"]))
for i in chats:
try:
m = await app.forward_messages(i, y, x)
await asyncio.sleep(0.3)
sent += 1
except Exception:
pass
await msg.edit_text(f"✅ Broadcasted message in {sent} chats")
return
if len(message.command) < 2:
await message.reply_text(
"**usage**:\n\n/broadcast (message)"
)
return
msg = await message.reply_text("🔄 Broadcasting message...")
text = message.text.split(None, 1)[1]
sent = 0
chats = []
schats = await get_served_chats()
for chat in schats:
chats.append(int(chat["chat_id"]))
for i in chats:
try:
m = await app.send_message(i, text=text)
await asyncio.sleep(0.3)
sent += 1
except Exception:
pass
await msg.edit_text(f"✅ Broadcasted message in {sent} chats")
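Both handlers above repeat the same send/sleep/except loop four times. The core pattern can be factored into one coroutine; a dependency-free sketch where the `send` callable stands in for `app.send_message` / `app.forward_messages` (the `broadcast` helper is an assumption, not part of Pyrogram):

```python
import asyncio


async def broadcast(chat_ids, send, delay=0.3):
    """Call send(chat_id) for every chat, pausing `delay` seconds between
    attempts to stay under flood limits; failures are counted out, not raised."""
    sent = 0
    for chat_id in chat_ids:
        try:
            await send(chat_id)
            sent += 1
        except Exception:
            pass
        await asyncio.sleep(delay)
    return sent


async def demo():
    delivered = []

    async def fake_send(chat_id):
        if chat_id < 0:
            raise RuntimeError("kicked from chat")
        delivered.append(chat_id)

    return await broadcast([1, -2, 3], fake_send, delay=0), delivered


sent, delivered = asyncio.run(demo())
```

With this helper, the pin variant only differs in the `send` callable it passes in, so the two handlers no longer need duplicated loops.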
# =========================================================================
# File: ion/services/coi/test/test_governance.py
# Repo: ooici/coi-services (license: BSD-2-Clause)
# =========================================================================
#!/usr/bin/env python
__author__ = 'Stephen P. Henrie'
import unittest, os, gevent, platform, simplejson
from mock import Mock, patch
from pyon.util.containers import get_ion_ts
from pyon.util.int_test import IonIntegrationTestCase
from nose.plugins.attrib import attr
from pyon.util.context import LocalContextMixin
from pyon.agent.agent import ResourceAgentState, ResourceAgentEvent
from pyon.datastore.datastore import DatastoreManager
from pyon.event.event import EventRepository
from pyon.core.exception import BadRequest, Conflict, Inconsistent, NotFound, Unauthorized, InstStateError
from pyon.public import PRED, RT, IonObject, CFG, log, OT, LCS, LCE, AS
from interface.services.coi.iresource_registry_service import ResourceRegistryServiceProcessClient
from interface.services.coi.iorg_management_service import OrgManagementServiceProcessClient
from interface.services.coi.iidentity_management_service import IdentityManagementServiceProcessClient
from interface.services.sa.iinstrument_management_service import InstrumentManagementServiceProcessClient
from interface.services.coi.iexchange_management_service import ExchangeManagementServiceProcessClient
from interface.services.coi.ipolicy_management_service import PolicyManagementServiceProcessClient
from interface.services.cei.ischeduler_service import SchedulerServiceProcessClient
from interface.services.coi.isystem_management_service import SystemManagementServiceProcessClient
from interface.services.sa.iobservatory_management_service import ObservatoryManagementServiceProcessClient
from pyon.ion.resregistry import ResourceRegistryServiceWrapper
from interface.objects import AgentCommand, ProposalOriginatorEnum, ProposalStatusEnum, NegotiationStatusEnum, ComputedValueAvailability
from ion.agents.instrument.direct_access.direct_access_server import DirectAccessTypes
from pyon.core.governance.negotiation import Negotiation
from ion.processes.bootstrap.load_system_policy import LoadSystemPolicy
from pyon.core.governance import ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, ION_MANAGER, get_system_actor, get_system_actor_header
from pyon.core.governance import get_actor_header, get_web_authentication_actor
from ion.services.sa.observatory.observatory_management_service import INSTRUMENT_OPERATOR_ROLE, OBSERVATORY_OPERATOR_ROLE
from pyon.net.endpoint import RPCClient, BidirClientChannel
from ion.services.sa.test.test_find_related_resources import ResourceHelper
ORG2 = 'Org 2'
USER1_CERTIFICATE = """-----BEGIN CERTIFICATE-----
MIIEMzCCAxugAwIBAgICBQAwDQYJKoZIhvcNAQEFBQAwajETMBEGCgmSJomT8ixkARkWA29yZzEX
MBUGCgmSJomT8ixkARkWB2NpbG9nb24xCzAJBgNVBAYTAlVTMRAwDgYDVQQKEwdDSUxvZ29uMRsw
GQYDVQQDExJDSUxvZ29uIEJhc2ljIENBIDEwHhcNMTAxMTE4MjIyNTA2WhcNMTAxMTE5MTAzMDA2
WjBvMRMwEQYKCZImiZPyLGQBGRMDb3JnMRcwFQYKCZImiZPyLGQBGRMHY2lsb2dvbjELMAkGA1UE
BhMCVVMxFzAVBgNVBAoTDlByb3RlY3ROZXR3b3JrMRkwFwYDVQQDExBSb2dlciBVbndpbiBBMjU0
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA6QhsWxhUXbIxg+1ZyEc7d+hIGvchVmtb
g0kKLmivgoVsA4U7swNDRH6svW242THta0oTf6crkRx7kOKg6jma2lcAC1sjOSddqX7/92ChoUPq
7LWt2T6GVVA10ex5WAeB/o7br/Z4U8/75uCBis+ru7xEDl09PToK20mrkcz9M4HqIv1eSoPkrs3b
2lUtQc6cjuHRDU4NknXaVMXTBHKPM40UxEDHJueFyCiZJFg3lvQuSsAl4JL5Z8pC02T8/bODBuf4
dszsqn2SC8YDw1xrujvW2Bd7Q7BwMQ/gO+dZKM1mLJFpfEsR9WrjMeg6vkD2TMWLMr0/WIkGC8u+
6M6SMQIDAQABo4HdMIHaMAwGA1UdEwEB/wQCMAAwDgYDVR0PAQH/BAQDAgSwMBMGA1UdJQQMMAoG
CCsGAQUFBwMCMBgGA1UdIAQRMA8wDQYLKwYBBAGCkTYBAgEwagYDVR0fBGMwYTAuoCygKoYoaHR0
cDovL2NybC5jaWxvZ29uLm9yZy9jaWxvZ29uLWJhc2ljLmNybDAvoC2gK4YpaHR0cDovL2NybC5k
b2Vncmlkcy5vcmcvY2lsb2dvbi1iYXNpYy5jcmwwHwYDVR0RBBgwFoEUaXRzYWdyZWVuMUB5YWhv
by5jb20wDQYJKoZIhvcNAQEFBQADggEBAEYHQPMY9Grs19MHxUzMwXp1GzCKhGpgyVKJKW86PJlr
HGruoWvx+DLNX75Oj5FC4t8bOUQVQusZGeGSEGegzzfIeOI/jWP1UtIjzvTFDq3tQMNvsgROSCx5
CkpK4nS0kbwLux+zI7BWON97UpMIzEeE05pd7SmNAETuWRsHMP+x6i7hoUp/uad4DwbzNUGIotdK
f8b270icOVgkOKRdLP/Q4r/x8skKSCRz1ZsRdR+7+B/EgksAJj7Ut3yiWoUekEMxCaTdAHPTMD/g
Mh9xL90hfMJyoGemjJswG5g3fAdTP/Lv0I6/nWeH/cLjwwpQgIEjEAVXl7KHuzX5vPD/wqQ=
-----END CERTIFICATE-----"""
DENY_EXCHANGE_TEXT = '''
<Rule RuleId="%s" Effect="Deny">
<Description>
%s
</Description>
<Target>
<Resources>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">exchange_management</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
</Resources>
</Target>
</Rule>
'''
TEST_POLICY_TEXT = '''
<Rule RuleId="%s" Effect="Permit">
<Description>
%s
</Description>
<Target>
<Subjects>
<Subject>
<SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">anonymous</AttributeValue>
<SubjectAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:subject:subject-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</SubjectMatch>
</Subject>
</Subjects>
<Actions>
<Action>
<ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">create_exchange_space</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ActionMatch>
</Action>
</Actions>
</Target>
</Rule>
'''
TEST_BOUNDARY_POLICY_TEXT = '''
<Rule RuleId="%s" Effect="Deny">
<Description>
%s
</Description>
<Target>
<Subjects>
<Subject>
<SubjectMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">anonymous</AttributeValue>
<SubjectAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:subject:subject-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</SubjectMatch>
</Subject>
</Subjects>
</Target>
</Rule>
'''
###########
DENY_PARAM_50_RULE = '''
<Rule RuleId="%s:" Effect="Permit">
<Description>
%s
</Description>
<Target>
<Resources>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-regexp-match">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">.*$</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">agent</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:receiver-type" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
</Resources>
<Actions>
<Action>
<ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">set_resource</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ActionMatch>
</Action>
</Actions>
</Target>
<Condition>
<Apply FunctionId="urn:oasis:names:tc:xacml:1.0:function:evaluate-code">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string"><![CDATA[def policy_func(process, message, headers):
params = message['params']
if params['INTERVAL'] <= 50:
return True, ''
return False, 'The value for SBE37Parameter.INTERVAL cannot be greater than 50'
]]>
</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:param-dict" DataType="http://www.w3.org/2001/XMLSchema#dict"/>
</Apply>
</Condition>
</Rule>
'''
DENY_PARAM_30_RULE = '''
<Rule RuleId="%s:" Effect="Permit">
<Description>
%s
</Description>
<Target>
<Resources>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-regexp-match">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">.*$</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">agent</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:receiver-type" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
</Resources>
<Actions>
<Action>
<ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">set_resource</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ActionMatch>
</Action>
</Actions>
</Target>
<Condition>
<Apply FunctionId="urn:oasis:names:tc:xacml:1.0:function:evaluate-code">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string"><![CDATA[def policy_func(process, message, headers):
params = message['params']
if params['INTERVAL'] <= 30:
return True, ''
return False, 'The value for SBE37Parameter.INTERVAL cannot be greater than 30'
]]>
</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:param-dict" DataType="http://www.w3.org/2001/XMLSchema#dict"/>
</Apply>
</Condition>
</Rule>
'''
DENY_PARAM_10_RULE = '''
<Rule RuleId="%s:" Effect="Permit">
<Description>
%s
</Description>
<Target>
<Resources>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-regexp-match">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">.*$</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:resource-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
<Resource>
<ResourceMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">agent</AttributeValue>
<ResourceAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:resource:receiver-type" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ResourceMatch>
</Resource>
</Resources>
<Actions>
<Action>
<ActionMatch MatchId="urn:oasis:names:tc:xacml:1.0:function:string-equal">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string">set_resource</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:action-id" DataType="http://www.w3.org/2001/XMLSchema#string"/>
</ActionMatch>
</Action>
</Actions>
</Target>
<Condition>
<Apply FunctionId="urn:oasis:names:tc:xacml:1.0:function:evaluate-code">
<AttributeValue DataType="http://www.w3.org/2001/XMLSchema#string"><![CDATA[def policy_func(process, message, headers):
params = message['params']
if params['INTERVAL'] <= 10:
return True, ''
return False, 'The value for SBE37Parameter.INTERVAL cannot be greater than 10'
]]>
</AttributeValue>
<ActionAttributeDesignator AttributeId="urn:oasis:names:tc:xacml:1.0:action:param-dict" DataType="http://www.w3.org/2001/XMLSchema#dict"/>
</Apply>
</Condition>
</Rule>
'''
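# The rule templates above each leave two '%s' slots (RuleId and Description),
# and the embedded <![CDATA[...]]> conditions follow an (allowed, message)
# contract. A minimal standalone sketch with hypothetical values:

```python
# Illustration only (hypothetical values, not part of the test suite): how a
# rule template's two '%s' slots are filled, and the (bool, message) contract
# of the evaluate-code conditions.
_TEMPLATE = '<Rule RuleId="%s" Effect="Permit"><Description>%s</Description></Rule>'
example_rule = _TEMPLATE % ('Example_Rule_Id', 'An example description')

def policy_func(process, message, headers):
    # Mirrors the INTERVAL checks embedded above: permit values up to 50,
    # otherwise deny with an explanatory message.
    params = message['params']
    if params['INTERVAL'] <= 50:
        return True, ''
    return False, 'The value for SBE37Parameter.INTERVAL cannot be greater than 50'
```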
@attr('INT', group='coi')
class TestGovernanceHeaders(IonIntegrationTestCase):
def setUp(self):
# Start container
self._start_container()
#Load a deploy file
self.container.start_rel_from_url('res/deploy/r2deploy.yml')
#Instantiate a process to represent the test
process=GovernanceTestProcess()
self.rr_client = ResourceRegistryServiceProcessClient(node=self.container.node, process=process)
#Get info on the ION System Actor
self.system_actor = get_system_actor()
log.info('system actor:' + self.system_actor._id)
self.system_actor_header = get_system_actor_header()
self.resource_id_header_value = ''
@attr('LOCOINT')
@attr('HEADERS')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
def test_governance_message_headers(self):
'''
This test is used to make sure the ION endpoint code is properly setting the
resource-id header on outgoing messages.
'''
#Get function pointer to send function
old_send = BidirClientChannel._send
# Create a replacement for _send that records headers before delegating to the original
def patched_send(*args, **kwargs):
#Only duplicate the message send from the initial client call
msg_headers = kwargs['headers']
if (self.resource_id_header_value == '') and 'resource-id' in msg_headers:
self.resource_id_header_value = msg_headers['resource-id']
return old_send(*args, **kwargs)
# patch it into place with auto-cleanup so we can interrogate the message headers
patcher = patch('pyon.net.endpoint.BidirClientChannel._send', patched_send)
patcher.start()
self.addCleanup(patcher.stop)
# Instantiate an object
obj = IonObject("UserInfo", name="name")
# Can't call update with object that hasn't been persisted
with self.assertRaises(BadRequest) as cm:
self.rr_client.update(obj)
# self.assertTrue(cm.exception.message.startswith("Object does not have required '_id' or '_rev' attribute"))
self.resource_id_header_value = ''
# Persist object and read it back
obj_id, obj_rev = self.rr_client.create(obj)
log.debug('The id of the created object is %s', obj_id)
self.assertEqual(self.resource_id_header_value, '' )
self.resource_id_header_value = ''
read_obj = self.rr_client.read(obj_id)
self.assertEqual(self.resource_id_header_value, obj_id )
# Cannot create object with _id and _rev fields pre-set
self.resource_id_header_value = ''
with self.assertRaises(BadRequest) as cm:
self.rr_client.create(read_obj)
#self.assertTrue(cm.exception.message.startswith("Doc must not have '_id'"))
self.assertEqual(self.resource_id_header_value, '' )
# Update object
read_obj.name = "John Doe"
self.resource_id_header_value = ''
self.rr_client.update(read_obj)
self.assertEqual(self.resource_id_header_value, obj_id )
# Update should fail with revision mismatch
self.resource_id_header_value = ''
with self.assertRaises(Conflict) as cm:
self.rr_client.update(read_obj)
#self.assertTrue(cm.exception.message.startswith("Object not based on most current version"))
self.assertEqual(self.resource_id_header_value, obj_id )
# Re-read and update object
self.resource_id_header_value = ''
read_obj = self.rr_client.read(obj_id)
self.assertEqual(self.resource_id_header_value, obj_id )
self.resource_id_header_value = ''
self.rr_client.update(read_obj)
self.assertEqual(self.resource_id_header_value, obj_id )
#Create second object
obj = IonObject("UserInfo", name="Babs Smith")
self.resource_id_header_value = ''
# Persist object and read it back
obj2_id, obj2_rev = self.rr_client.create(obj)
log.debug('The id of the created object is %s', obj2_id)
self.assertEqual(self.resource_id_header_value, '' )
#Test for multi-read
self.resource_id_header_value = ''
objs = self.rr_client.read_mult([obj_id, obj2_id])
self.assertEqual(self.resource_id_header_value, [obj_id, obj2_id])
self.assertEqual(len(objs),2)
# Delete object
self.resource_id_header_value = ''
self.rr_client.delete(obj_id)
self.assertEqual(self.resource_id_header_value, obj_id )
# Delete object
self.resource_id_header_value = ''
self.rr_client.delete(obj2_id)
self.assertEqual(self.resource_id_header_value, obj2_id )
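# A minimal, self-contained sketch (hypothetical names, not part of the suite)
# of the header-capture pattern used in test_governance_message_headers: wrap
# the transport's send function so the first 'resource-id' header observed is
# recorded, then delegate to the original.

```python
# Standalone illustration of the monkey-patch pattern above; all names here
# are hypothetical stand-ins for BidirClientChannel._send and the test state.
captured = {'resource-id': ''}

def original_send(msg, headers=None):
    # Stand-in for the real channel send; just echoes the message.
    return msg

def patched_send(msg, headers=None):
    # Record the first resource-id header seen, then delegate to the original.
    headers = headers or {}
    if captured['resource-id'] == '' and 'resource-id' in headers:
        captured['resource-id'] = headers['resource-id']
    return original_send(msg, headers=headers)

result = patched_send('read', headers={'resource-id': 'abc123'})
```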
class GovernanceTestProcess(LocalContextMixin):
name = 'gov_test'
id='gov_client'
process_type = 'simple'
@attr('INT', group='coi')
class TestGovernanceInt(IonIntegrationTestCase):
def __init__(self, *args, **kwargs):
#Hack for running tests on CentOS which is significantly slower than a Mac
self.SLEEP_TIME = 1
ver = platform.mac_ver()
if ver[0] == '':
self.SLEEP_TIME = 3 # Increase for non-Macs
log.info('Not running on a Mac')
else:
log.info('Running on a Mac')
IonIntegrationTestCase.__init__(self, *args, **kwargs)
def setUp(self):
# Start container
self._start_container()
#Load a deploy file
self.container.start_rel_from_url('res/deploy/r2deploy.yml')
#Instantiate a process to represent the test
process=GovernanceTestProcess()
#Load system policies after container has started all of the services
policy_loaded = CFG.get_safe('system.load_policy', False)
if not policy_loaded:
log.debug('Loading policy')
LoadSystemPolicy.op_load_system_policies(process)
gevent.sleep(self.SLEEP_TIME*2) # Wait for events to be fired and policy updated
self.rr_msg_client = ResourceRegistryServiceProcessClient(node=self.container.node, process=process)
self.rr_client = ResourceRegistryServiceWrapper(self.container.resource_registry, process)
self.id_client = IdentityManagementServiceProcessClient(node=self.container.node, process=process)
self.pol_client = PolicyManagementServiceProcessClient(node=self.container.node, process=process)
self.org_client = OrgManagementServiceProcessClient(node=self.container.node, process=process)
self.ims_client = InstrumentManagementServiceProcessClient(node=self.container.node, process=process)
self.ems_client = ExchangeManagementServiceProcessClient(node=self.container.node, process=process)
self.ssclient = SchedulerServiceProcessClient(node=self.container.node, process=process)
self.sys_management = SystemManagementServiceProcessClient(node=self.container.node, process=process)
self.obs_client = ObservatoryManagementServiceProcessClient(node=self.container.node, process=process)
#Get info on the ION System Actor
self.system_actor = get_system_actor()
log.info('system actor:' + self.system_actor._id)
self.system_actor_header = get_system_actor_header()
#Get info on the Web Authentication Actor
self.apache_actor = get_web_authentication_actor()
if not self.apache_actor:
#Can't find the apache actor so just use the system actor
self.apache_actor = self.system_actor
self.apache_actor_header = get_actor_header(self.apache_actor._id)
self.anonymous_actor_headers = {'ion-actor-id':'anonymous'}
self.ion_org = self.org_client.find_org()
#Setup access to event repository
dsm = DatastoreManager()
ds = dsm.get_datastore("events")
self.event_repo = EventRepository(dsm)
def tearDown(self):
policy_list, _ = self.rr_client.find_resources(restype=RT.Policy)
#Must remove the policies in the reverse order they were added
for policy in sorted(policy_list,key=lambda p: p.ts_created, reverse=True):
self.pol_client.delete_policy(policy._id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and policy updated
@attr('LOCOINT')
@attr('BASIC')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
def test_basic_policy_operations(self):
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy, id_only=True)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
log.debug('Begin testing with policies')
#First check existing policies to see if they are in place to keep an anonymous user from creating things
with self.assertRaises(Unauthorized) as cm:
test_org_id = self.org_client.create_org(org=IonObject(RT.Org, name='test_org', description='A test Org'))
self.assertIn( 'org_management(create_org) has been denied',cm.exception.message)
with self.assertRaises(NotFound) as cm:
test_org = self.org_client.find_org(name='test_org')
#Add a new policy to deny all operations in the Exchange Management Service by default.
test_policy_id = self.pol_client.create_service_access_policy('exchange_management', 'Exchange_Management_Deny_Policy',
'Deny all operations in Exchange Management Service by default',
DENY_EXCHANGE_TEXT, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and policy updated
#Attempt to access an operation in a service which does not have specific policies set
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'exchange_management(create_exchange_space) has been denied',cm.exception.message)
#Add a new policy to allow the above service call.
test_policy_id = self.pol_client.create_service_access_policy('exchange_management', 'Exchange_Management_Test_Policy',
'Allow specific operations in the Exchange Management Service for anonymous user',
TEST_POLICY_TEXT, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
#disable the test policy to try again
self.pol_client.disable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'exchange_management(create_exchange_space) has been denied',cm.exception.message)
#now enable the test policy to try again
self.pol_client.enable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
#Now test service operation specific policies - specifically that there can be more than one on the same operation.
pol1_id = self.pol_client.add_process_operation_precondition_policy(process_name='policy_management', op='disable_policy', policy_content='func1_pass', headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#try to disable the test policy again
self.pol_client.disable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'exchange_management(create_exchange_space) has been denied',cm.exception.message)
#now enable the test policy to try again
self.pol_client.enable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
pol2_id = self.pol_client.add_process_operation_precondition_policy(process_name='policy_management', op='disable_policy', policy_content='func2_deny', headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#try to disable the test policy again
with self.assertRaises(Unauthorized) as cm:
self.pol_client.disable_policy(test_policy_id, headers=self.system_actor_header)
self.assertIn( 'Denied for no reason',cm.exception.message)
self.pol_client.delete_policy(pol2_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#try to disable the test policy again
self.pol_client.disable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'exchange_management(create_exchange_space) has been denied',cm.exception.message)
#try to enable the test policy again
self.pol_client.enable_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
pre_func1 =\
"""def precondition_func(process, msg, headers):
if headers['op'] == 'disable_policy':
return False, 'Denied for no reason again'
else:
return True, ''
"""
#Create a dynamic precondition function to deny calls to disable policy
pre_func1_id = self.pol_client.add_process_operation_precondition_policy(process_name='policy_management', op='disable_policy', policy_content=pre_func1, headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#try to disable the test policy again
with self.assertRaises(Unauthorized) as cm:
self.pol_client.disable_policy(test_policy_id, headers=self.system_actor_header)
self.assertIn( 'Denied for no reason again',cm.exception.message)
#Now delete the most recent precondition policy
self.pol_client.delete_policy(pre_func1_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
#Now test that a precondition function can be enabled and disabled
pre_func2 =\
"""def precondition_func(process, msg, headers):
if headers['op'] == 'create_exchange_space':
return False, 'Denied from an operation precondition function'
else:
return True, ''
"""
#Create a dynamic precondition function to deny calls to create_exchange_space
pre_func2_id = self.pol_client.add_process_operation_precondition_policy(process_name='exchange_management', op='create_exchange_space', policy_content=pre_func2, headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Denied from an operation precondition function',cm.exception.message)
#Disable the precondition policy
self.pol_client.disable_policy(pre_func2_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
#Re-enable the precondition policy
self.pol_client.enable_policy(pre_func2_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Denied from an operation precondition function',cm.exception.message)
#Delete the precondition policy
self.pol_client.delete_policy(pre_func2_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The previous attempt at this operation should now be allowed.
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(BadRequest) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'Arguments not set',cm.exception.message)
self.pol_client.delete_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The same request that previously was allowed should now be denied
es_obj = IonObject(RT.ExchangeSpace, description= 'ION test XS', name='ioncore2' )
with self.assertRaises(Unauthorized) as cm:
self.ems_client.create_exchange_space(es_obj, headers=self.anonymous_actor_headers)
self.assertIn( 'exchange_management(create_exchange_space) has been denied',cm.exception.message)
###########
### Now test access to service create* operations based on roles...
#Anonymous users should not be allowed
with self.assertRaises(Unauthorized) as cm:
id = self.ssclient.create_interval_timer(start_time="now", event_origin="Interval_Timer_233", headers=self.anonymous_actor_headers)
self.assertIn( 'scheduler(create_interval_timer) has been denied',cm.exception.message)
#now try creating a new user with a valid actor
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.info( "actor id=" + actor_id)
actor_header = get_actor_header(actor_id)
#User without OPERATOR or MANAGER role should not be allowed
with self.assertRaises(Unauthorized) as cm:
id = self.ssclient.create_interval_timer(start_time="now", event_origin="Interval_Timer_233", headers=actor_header)
self.assertIn( 'scheduler(create_interval_timer) has been denied',cm.exception.message)
#Grant the ORG_MANAGER_ROLE to the user.
self.org_client.grant_role(self.ion_org._id, actor_id, ORG_MANAGER_ROLE, headers=self.system_actor_header)
#Refresh headers with new role
actor_header = get_actor_header(actor_id)
#User with proper role should now be allowed to access this service operation.
id = self.ssclient.create_interval_timer(start_time="now", end_time="-1", event_origin="Interval_Timer_233", headers=actor_header)
@attr('LOCOINT')
@attr('RESET')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
@patch.dict(CFG, {'container':{'org_boundary':True}})
def test_policy_cache_reset(self):
before_policy_set = self.container.governance_controller.get_active_policies()
#First clear all of the policies to test that failures will be caught due to missing policies
self.container.governance_controller._reset_container_policy_caches()
empty_policy_set = self.container.governance_controller.get_active_policies()
self.assertEqual(len(empty_policy_set['service_access'].keys()), 0)
self.assertEqual(len(empty_policy_set['resource_access'].keys()), 0)
#With policies gone, an anonymous user should be able to create an object
test_org_id = self.org_client.create_org(org=IonObject(RT.Org, name='test_org1', description='A test Org'))
test_org = self.org_client.find_org(name='test_org1')
self.assertEqual(test_org._id, test_org_id)
#Trigger the event to reset the policy caches
self.sys_management.reset_policy_cache()
gevent.sleep(20) # Wait for events to be published and policy reloaded for all running processes
after_policy_set = self.container.governance_controller.get_active_policies()
#With policies refreshed, an anonymous user should NOT be able to create an object
with self.assertRaises(Unauthorized) as cm:
test_org_id = self.org_client.create_org(org=IonObject(RT.Org, name='test_org2', description='A test Org'))
self.assertIn( 'org_management(create_org) has been denied',cm.exception.message)
with self.assertRaises(NotFound) as cm:
test_org = self.org_client.find_org(name='test_org2')
self.assertEqual(len(before_policy_set.keys()), len(after_policy_set.keys()))
self.assertEqual(len(before_policy_set['service_access'].keys()), len(after_policy_set['service_access'].keys()))
self.assertEqual(len(before_policy_set['resource_access'].keys()), len(after_policy_set['resource_access'].keys()))
self.assertEqual(len(before_policy_set['service_operation'].keys()), len(after_policy_set['service_operation'].keys()))
#If the number of keys for service operations were equal, then check each set of operation precondition functions
for key in before_policy_set['service_operation']:
self.assertEqual(len(before_policy_set['service_operation'][key]), len(after_policy_set['service_operation'][key]))
@attr('LOCOINT')
@attr('BOUNDARY')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
@patch.dict(CFG, {'container':{'org_boundary':True}})
def test_org_boundary(self):
with self.assertRaises(NotFound) as nf:
org2 = self.org_client.find_org(ORG2)
self.assertIn('The Org with name Org 2 does not exist',nf.exception.message)
#Create a second Org
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
#First try to get a list of Users by hitting the RR anonymously - should be allowed.
actors,_ = self.rr_msg_client.find_resources(restype=RT.ActorIdentity)
self.assertEqual(len(actors),2) #Should include the ION System Actor and the web auth actor.
log.debug('Begin testing with policies')
#Create a new actor - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn( 'identity_management(signon) has been denied',cm.exception.message)
#Now try creating a new actor with valid credentials
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.info( "actor id=" + actor_id)
actor_header = get_actor_header(actor_id)
#Get the list of Users from the RR anonymously again - should still be allowed.
actors,_ = self.rr_msg_client.find_resources(restype=RT.ActorIdentity)
self.assertEqual(len(actors),3) #Should include the ION System Actor and web auth actor as well.
#Now enroll the actor as a member of the Second Org
self.org_client.enroll_member(org2_id,actor_id, headers=self.system_actor_header)
actor_header = get_actor_header(actor_id)
#Add a new Org boundary policy which denies all anonymous access
test_policy_id = self.pol_client.create_resource_access_policy( org2_id, 'Org_Test_Policy',
'Deny all access for anonymous actor',
TEST_BOUNDARY_POLICY_TEXT, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and policy updated
#Hack to force container into an Org Boundary for second Org
self.container.governance_controller._container_org_name = org2.org_governance_name
self.container.governance_controller._is_container_org_boundary = True
#First try to get a list of Users by hitting the RR anonymously - should be denied.
with self.assertRaises(Unauthorized) as cm:
actors,_ = self.rr_msg_client.find_resources(restype=RT.ActorIdentity, headers=self.anonymous_actor_headers)
self.assertIn( 'resource_registry(find_resources) has been denied',cm.exception.message)
#Now try to hit the RR with a real user and should now be allowed
actors,_ = self.rr_msg_client.find_resources(restype=RT.ActorIdentity, headers=actor_header)
self.assertEqual(len(actors),3) #Should include the ION System Actor and web auth actor as well.
#TODO - figure out how to write an XACML rule that also requires membership in the specific Org
#Hack to force container back to default values
self.container.governance_controller._container_org_name = 'ION'
self.container.governance_controller._is_container_org_boundary = False
self.container.governance_controller._container_org_id = None
self.pol_client.delete_policy(test_policy_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
@attr('LOCOINT')
@attr('ENROLL')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
def test_org_enroll_negotiation(self):
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
with self.assertRaises(BadRequest) as cm:
myorg = self.org_client.read_org()
self.assertEqual(cm.exception.message, 'The org_id parameter is missing')
log.debug('Begin testing with policies')
#Create a new user - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn( 'identity_management(signon) has been denied',cm.exception.message)
#Now create user with proper credentials
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.info( "actor id=" + actor_id)
#Build the message headers used with this user
actor_header = get_actor_header(actor_id)
#Get the associated user id
user_info = IonObject(RT.UserInfo, name='Test User')
actor_user_id = self.id_client.create_user_info(actor_id=actor_id, user_info=user_info, headers=actor_header)
#Attempt to enroll a user anonymously - should not be allowed
with self.assertRaises(Unauthorized) as cm:
self.org_client.enroll_member(self.ion_org._id,actor_id, headers=self.anonymous_actor_headers)
self.assertIn( 'org_management(enroll_member) has been denied',cm.exception.message)
#Attempt to let a user enroll themselves - should not be allowed
with self.assertRaises(Unauthorized) as cm:
self.org_client.enroll_member(self.ion_org._id,actor_id, headers=actor_header)
self.assertIn( 'org_management(enroll_member) has been denied',cm.exception.message)
#Attempt to enroll the user in the ION Root Org as a manager - should not be allowed since
#registration with the system implies membership in the ROOT Org.
with self.assertRaises(BadRequest) as cm:
self.org_client.enroll_member(self.ion_org._id,actor_id, headers=self.system_actor_header)
self.assertEqual(cm.exception.message, 'A request to enroll in the root ION Org is not allowed')
#Verify that anonymous user cannot find a list of enrolled users in an Org
with self.assertRaises(Unauthorized) as cm:
actors = self.org_client.find_enrolled_users(self.ion_org._id, headers=self.anonymous_actor_headers)
self.assertIn('org_management(find_enrolled_users) has been denied',cm.exception.message)
#Verify that a user without the Org Manager role cannot find a list of enrolled users in an Org
with self.assertRaises(Unauthorized) as cm:
actors = self.org_client.find_enrolled_users(self.ion_org._id, headers=actor_header)
self.assertIn( 'org_management(find_enrolled_users) has been denied',cm.exception.message)
actors = self.org_client.find_enrolled_users(self.ion_org._id, headers=self.system_actor_header)
self.assertEqual(len(actors),3) # Will include the ION system actor
#Create a second Org
with self.assertRaises(NotFound) as nf:
org2 = self.org_client.find_org(ORG2)
self.assertIn('The Org with name Org 2 does not exist',nf.exception.message)
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),0)
#Build the Service Agreement Proposal for enrollment request
sap = IonObject(OT.EnrollmentProposal,consumer=actor_id, provider=org2_id )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),1)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, headers=actor_header)
self.assertEqual(len(negotiations),1)
#Build a second Service Agreement Proposal for the same enrollment request
sap2 = IonObject(OT.EnrollmentProposal,consumer=actor_id, provider=org2_id )
#User tries proposing an enrollment again - this should fail
with self.assertRaises(BadRequest) as cm:
self.org_client.negotiate(sap2, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: not is_enroll_negotiation_open',cm.exception.message)
#Manager tries to reject the proposal, but incorrectly
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.EnrollmentProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.REJECTED, ProposalOriginatorEnum.PROVIDER)
sap_response.sequence_num -= 1
#Should fail because the proposal sequence was not incremented
with self.assertRaises(Inconsistent) as cm:
self.org_client.negotiate(sap_response, headers=actor_header )
self.assertIn('The Service Agreement Proposal does not have the correct sequence_num value (0) for this negotiation (1)',cm.exception.message)
#Manager now tries to reject the proposal, this time with the correct proposal sequence
sap_response.sequence_num += 1
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),1)
self.assertEqual(negotiations[0].negotiation_status, NegotiationStatusEnum.REJECTED)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.EnrollmentNegotiationStatusEvent)
self.assertEquals(len(events_r), 2)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.REJECTED])
#Create a new enrollment proposal
#Build the Service Agreement Proposal to enroll
sap = IonObject(OT.EnrollmentProposal,consumer=actor_id, provider=org2_id, description='Enrollment request for test user' )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, headers=actor_header)
self.assertEqual(len(negotiations),2)
actors = self.org_client.find_enrolled_users(org2_id, headers=self.system_actor_header)
self.assertEqual(len(actors),0)
#Call get extended marine facility to check the open and closed negotiations as seen by a normal user
ext_mf = self.obs_client.get_marine_facility_extension(org_id=org2_id,user_id=actor_user_id, headers=actor_header)
self.assertEqual(len(ext_mf.closed_requests), 0)
self.assertEqual(len(ext_mf.open_requests), 0)
#Call get extended marine facility to check the open and closed negotiations as seen by a privileged user
ext_mf = self.obs_client.get_marine_facility_extension(org_id=org2_id,user_id=self.system_actor._id, headers=self.system_actor_header)
self.assertEqual(len(ext_mf.closed_requests), 1)
self.assertEqual(len(ext_mf.open_requests), 1)
#Manager approves proposal
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.EnrollmentProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
#Make sure the Negotiation object has the proper description set from the initial SAP
self.assertEqual(negotiations[0].description, sap.description)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED, ProposalOriginatorEnum.PROVIDER)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
actors = self.org_client.find_enrolled_users(org2_id, headers=self.system_actor_header)
self.assertEqual(len(actors),1)
#User tries requesting enrollment again - this should fail
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.EnrollmentProposal,consumer=actor_id, provider=org2_id )
neg_id = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: not is_enrolled',cm.exception.message)
#Call get extended marine facility to check the open and closed negotiations as seen by a normal user
ext_mf = self.obs_client.get_marine_facility_extension(org_id=org2_id,user_id=actor_user_id, headers=actor_header)
self.assertEqual(len(ext_mf.closed_requests), 0)
self.assertEqual(len(ext_mf.open_requests), 0)
#Call get extended marine facility to check the open and closed negotiations as seen by a privileged user
ext_mf = self.obs_client.get_marine_facility_extension(org_id=org2_id,user_id=self.system_actor._id, headers=self.system_actor_header)
self.assertEqual(len(ext_mf.closed_requests), 2)
self.assertEqual(len(ext_mf.open_requests), 0)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.EnrollmentNegotiationStatusEvent)
self.assertEquals(len(events_r), 4)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.GRANTED])
events_c = self.event_repo.find_events(origin=org2_id, event_type=OT.OrgMembershipGrantedEvent)
self.assertEquals(len(events_c), 1)
events_i = self.event_repo.find_events(origin=org2_id, event_type=OT.OrgNegotiationInitiatedEvent)
self.assertEquals(len(events_i), 2)
ret = self.org_client.is_enrolled(org_id=org2_id, actor_id=actor_id, headers=self.system_actor_header)
self.assertEquals(ret, True)
self.org_client.cancel_member_enrollment(org_id=org2_id, actor_id=actor_id, headers=self.system_actor_header)
ret = self.org_client.is_enrolled(org_id=org2_id, actor_id=actor_id, headers=self.system_actor_header)
self.assertEquals(ret, False)
@attr('LOCOINT')
@attr('ROLE')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
def test_org_role_negotiation(self):
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
with self.assertRaises(BadRequest) as cm:
myorg = self.org_client.read_org()
self.assertEqual(cm.exception.message, 'The org_id parameter is missing')
log.debug('Begin testing with policies')
#Create a new user - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn( 'identity_management(signon) has been denied',cm.exception.message)
#Now create user with proper credentials
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.info( "actor id=" + actor_id)
#Build the message headers used with this user
actor_header = get_actor_header(actor_id)
actors = self.org_client.find_enrolled_users(self.ion_org._id, headers=self.system_actor_header)
self.assertEqual(len(actors),3) # Will include the ION system actor and the non-user actor from setup
## test_org_roles and policies
roles = self.org_client.find_org_roles(self.ion_org._id)
self.assertEqual(len(roles),3)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, ION_MANAGER])
roles = self.org_client.find_org_roles_by_user(self.ion_org._id, self.system_actor._id, headers=self.system_actor_header)
self.assertEqual(len(roles),3)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MEMBER_ROLE, ORG_MANAGER_ROLE, ION_MANAGER])
roles = self.org_client.find_org_roles_by_user(self.ion_org._id, actor_id, headers=self.system_actor_header)
self.assertEqual(len(roles),1)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MEMBER_ROLE])
#Create a second Org
with self.assertRaises(NotFound) as nf:
org2 = self.org_client.find_org(ORG2)
self.assertIn('The Org with name Org 2 does not exist',nf.exception.message)
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),2)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE])
#Create the Instrument Operator Role
operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE,name='Instrument Operator', description='Instrument Operator')
#First try to add the user role anonymously
with self.assertRaises(Unauthorized) as cm:
self.org_client.add_user_role(org2_id, operator_role, headers=self.anonymous_actor_headers)
self.assertIn('org_management(add_user_role) has been denied',cm.exception.message)
self.org_client.add_user_role(org2_id, operator_role, headers=self.system_actor_header)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),3)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, INSTRUMENT_OPERATOR_ROLE])
#Add the same role to the first Org as well
self.org_client.add_user_role(self.ion_org._id, operator_role, headers=self.system_actor_header)
# Test role proposals.
#First try to find user requests anonymously
with self.assertRaises(Unauthorized) as cm:
requests = self.org_client.find_org_negotiations(org2_id, headers=self.anonymous_actor_headers)
self.assertIn('org_management(find_org_negotiations) has been denied',cm.exception.message)
#Next try to find user requests as a basic member
with self.assertRaises(Unauthorized) as cm:
requests = self.org_client.find_org_negotiations(org2_id, headers=actor_header)
self.assertIn('org_management(find_org_negotiations) has been denied',cm.exception.message)
#Should not be denied for user with Org Manager role or ION System manager role
requests = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(requests),0)
#Build the Service Agreement Proposal for assigning a role to a user
sap = IonObject(OT.RequestRoleProposal,consumer=actor_id, provider=org2_id, role_name=INSTRUMENT_OPERATOR_ROLE )
# First try to request a role anonymously
with self.assertRaises(Unauthorized) as cm:
sap_response = self.org_client.negotiate(sap, headers=self.anonymous_actor_headers)
self.assertIn('org_management(negotiate) has been denied',cm.exception.message)
# Next try to propose to assign a role without being a member
with self.assertRaises(BadRequest) as cm:
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_enrolled',cm.exception.message)
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),0)
#Build the Service Agreement Proposal to enroll
sap = IonObject(OT.EnrollmentProposal,consumer=actor_id, provider=org2_id )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),1)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, headers=actor_header)
self.assertEqual(len(negotiations),1)
actors = self.org_client.find_enrolled_users(org2_id, headers=self.system_actor_header)
self.assertEqual(len(actors),0)
#Manager approves proposal
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.EnrollmentProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED, ProposalOriginatorEnum.PROVIDER)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
actors = self.org_client.find_enrolled_users(org2_id, headers=self.system_actor_header)
self.assertEqual(len(actors),1)
#Create a proposal to add a role to a user
sap = IonObject(OT.RequestRoleProposal,consumer=actor_id, provider=org2_id, role_name=INSTRUMENT_OPERATOR_ROLE )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
ret = self.org_client.has_role(org2_id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, False)
#Run through a series of different finds to ensure the various parameter filters are working.
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_org_negotiations(org2_id,negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
self.assertEqual(len(negotiations),1)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, headers=actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, proposal_type=OT.RequestRoleProposal, headers=actor_header)
self.assertEqual(len(negotiations),1)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
self.assertEqual(len(negotiations),1)
#Manager rejects the initial role proposal
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.RequestRoleProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.REJECTED, ProposalOriginatorEnum.PROVIDER)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_org_negotiations(org2_id,negotiation_status=NegotiationStatusEnum.REJECTED, headers=self.system_actor_header)
self.assertEqual(len(negotiations),1)
self.assertEqual(negotiations[0].negotiation_status, NegotiationStatusEnum.REJECTED)
#Make sure the user still does not have the requested role
ret = self.org_client.has_role(org2_id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, False)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.RequestRoleNegotiationStatusEvent)
self.assertEquals(len(events_r), 2)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.REJECTED])
#Create a second proposal to add a role to a user
sap = IonObject(OT.RequestRoleProposal,consumer=actor_id, provider=org2_id, role_name=INSTRUMENT_OPERATOR_ROLE )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),3)
closed_negotiations = self.org_client.find_org_closed_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(closed_negotiations),2)
#Create an instrument resource
ia_list,_ = self.rr_client.find_resources(restype=RT.InstrumentAgent)
self.assertEqual(len(ia_list),0)
ia_obj = IonObject(RT.InstrumentAgent, name='Instrument Agent1', description='The first Instrument Agent')
#Instruments should not be creatable by anonymous users
with self.assertRaises(Unauthorized) as cm:
self.ims_client.create_instrument_agent(ia_obj, headers=self.anonymous_actor_headers)
self.assertIn('instrument_management(create_instrument_agent) has been denied',cm.exception.message)
#Instruments should not be creatable by users that are not Instrument Operators
with self.assertRaises(Unauthorized) as cm:
self.ims_client.create_instrument_agent(ia_obj, headers=actor_header)
self.assertIn('instrument_management(create_instrument_agent) has been denied',cm.exception.message)
#Manager approves proposal for role request
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.RequestRoleProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED, ProposalOriginatorEnum.PROVIDER)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
#Make sure there are no more open negotiations
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
self.assertEqual(len(negotiations),0)
#Verify the user has been assigned the requested role in the second Org
ret = self.org_client.has_role(org2_id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, True)
#Verify the user has only been assigned the requested role in the second Org and not in the first Org
ret = self.org_client.has_role(self.ion_org._id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, False)
#Refresh headers with new role
actor_header = get_actor_header(actor_id)
#Now try to request the same role for the same user - should be denied
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.RequestRoleProposal,consumer=actor_id, provider=org2_id, role_name=INSTRUMENT_OPERATOR_ROLE )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: not has_role',cm.exception.message)
#Now the user with the proper role should be able to create an instrument.
self.ims_client.create_instrument_agent(ia_obj, headers=actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.RequestRoleNegotiationStatusEvent)
self.assertEquals(len(events_r), 4)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.GRANTED])
self.assertEqual(events_r[-1][2].role_name, sap_response2.role_name)
events_c = self.event_repo.find_events(origin=org2_id, event_type=OT.UserRoleGrantedEvent)
self.assertEquals(len(events_c), 2)
events_i = self.event_repo.find_events(origin=org2_id, event_type=OT.OrgNegotiationInitiatedEvent)
self.assertEquals(len(events_i), 3)
@attr('LOCOINT')
@attr('ACQUIRE')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
def test_org_acquire_resource_negotiation(self):
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
with self.assertRaises(BadRequest) as cm:
myorg = self.org_client.read_org()
self.assertEqual(cm.exception.message, 'The org_id parameter is missing')
log.debug('Begin testing with policies')
#Create a new user - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn( 'identity_management(signon) has been denied',cm.exception.message)
#Now create user with proper credentials
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.info( "actor id=" + actor_id)
#Create a second Org
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),2)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE])
#Create the Instrument Operator Role
operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE,name='Instrument Operator', description='Instrument Operator')
#And add it to all Orgs
self.org_client.add_user_role(self.ion_org._id, operator_role, headers=self.system_actor_header)
self.org_client.add_user_role(org2_id, operator_role, headers=self.system_actor_header)
#Add the INSTRUMENT_OPERATOR_ROLE to the User for the ION Org
self.org_client.grant_role(self.ion_org._id, actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Enroll the user in the second Org - done without negotiation for this test
self.org_client.enroll_member(org2_id, actor_id,headers=self.system_actor_header )
#Build the message headers used with this user
actor_header = get_actor_header(actor_id)
#Test the invitation process
#Create an invitation proposal to add a role to a user
sap = IonObject(OT.RequestRoleProposal,consumer=actor_id, provider=org2_id, role_name=INSTRUMENT_OPERATOR_ROLE,
originator=ProposalOriginatorEnum.PROVIDER )
sap_response = self.org_client.negotiate(sap, headers=self.system_actor_header )
ret = self.org_client.has_role(org2_id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, False)
#User creates proposal to approve
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, proposal_type=OT.RequestRoleProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED)
sap_response2 = self.org_client.negotiate(sap_response, headers=actor_header )
#Verify the user has been assigned the requested role in the second Org
ret = self.org_client.has_role(org2_id, actor_id,INSTRUMENT_OPERATOR_ROLE, headers=actor_header )
self.assertEqual(ret, True)
#Build the message headers used with this user
actor_header = get_actor_header(actor_id)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.RequestRoleNegotiationStatusEvent)
self.assertEquals(len(events_r), 4)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.GRANTED])
#Create the instrument agent with the user that has the proper role
ia_obj = IonObject(RT.InstrumentAgent, name='Instrument Agent1', description='The Instrument Agent')
self.ims_client.create_instrument_agent(ia_obj, headers=actor_header)
#Ensure the instrument agent has been created
ia_list,_ = self.rr_client.find_resources(restype=RT.InstrumentAgent)
self.assertEqual(len(ia_list),1)
self.assertEquals(ia_list[0].lcstate, LCS.DRAFT)
self.assertEquals(ia_list[0].availability, AS.PRIVATE)
#Advance the lifecycle to PLANNED. Requires the INSTRUMENT_OPERATOR role, so an anonymous user should fail
with self.assertRaises(Unauthorized) as cm:
self.ims_client.execute_instrument_agent_lifecycle(ia_list[0]._id, LCE.PLAN, headers=self.anonymous_actor_headers)
self.assertIn( 'instrument_management(execute_instrument_agent_lifecycle) has been denied',cm.exception.message)
#Advance the lifecycle to PLANNED. Requires the INSTRUMENT_OPERATOR role
self.ims_client.execute_instrument_agent_lifecycle(ia_list[0]._id, LCE.PLAN, headers=actor_header)
ia = self.rr_client.read(ia_list[0]._id)
self.assertEquals(ia.lcstate, LCS.PLANNED)
#First make an acquire resource request with a non-enrolled user.
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceProposal,consumer=self.system_actor._id, provider=org2_id, resource_id=ia_list[0]._id )
sap_response = self.org_client.negotiate(sap, headers=self.system_actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_enrolled',cm.exception.message)
#Make a proposal to acquire a resource with an enrolled user that has the right role, but the resource is not shared with the Org
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_resource_shared',cm.exception.message)
#So share the resource
self.org_client.share_resource(org_id=org2_id, resource_id=ia_list[0]._id, headers=self.system_actor_header )
#Verify the resource is shared
res_list,_ = self.rr_client.find_objects(org2,PRED.hasResource)
self.assertEqual(len(res_list), 1)
self.assertEqual(res_list[0]._id, ia_list[0]._id)
#First try to acquire the resource exclusively - it should fail since the user cannot do this without first
#having acquired the resource
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceExclusiveProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_resource_acquired',cm.exception.message)
#Make a proposal to acquire a resource with an enrolled user that has the right role and is now shared
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
negotiations = self.org_client.find_org_negotiations(org2_id, headers=self.system_actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, headers=actor_header)
self.assertEqual(len(negotiations),2)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, proposal_type=OT.AcquireResourceProposal, headers=actor_header)
self.assertEqual(len(negotiations),1)
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
self.assertEqual(len(negotiations),1)
self.assertEqual(negotiations[0]._id, sap_response.negotiation_id)
#Manager Creates a counter proposal
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.AcquireResourceProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
#Counter proposals for demonstration only
#Calculate one week from now in milliseconds
cur_time = int(get_ion_ts())
week_expiration = cur_time + ( 7 * 24 * 60 * 60 * 1000 )
sap_response = Negotiation.create_counter_proposal(negotiations[0], originator=ProposalOriginatorEnum.PROVIDER)
sap_response.expiration = str(week_expiration)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
#User Creates a counter proposal
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, proposal_type=OT.AcquireResourceProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
cur_time = int(get_ion_ts())
month_expiration = cur_time + ( 30 * 24 * 60 * 60 * 1000 )
sap_response = Negotiation.create_counter_proposal(negotiations[0])
sap_response.expiration = str(month_expiration)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME+1) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.AcquireResourceNegotiationStatusEvent)
self.assertEquals(len(events_r), 3)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.COUNTER])
self.assertEqual(events_r[-1][2].resource_id, ia_list[0]._id)
#Manager approves Instrument resource proposal
negotiations = self.org_client.find_org_negotiations(org2_id, proposal_type=OT.AcquireResourceProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=self.system_actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED, ProposalOriginatorEnum.PROVIDER)
sap_response2 = self.org_client.negotiate(sap_response, headers=self.system_actor_header )
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
self.assertEqual(len(negotiations),0) #Should be no more open negotiations for a user because auto-accept is enabled
#The following are no longer needed with auto-accept enabled for acquiring a resource
'''
self.assertEqual(len(negotiations),1)
#User accepts proposal in return
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, proposal_type=OT.AcquireResourceProposal,
negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
sap_response = Negotiation.create_counter_proposal(negotiations[0], ProposalStatusEnum.ACCEPTED)
sap_response2 = self.org_client.negotiate(sap_response, headers=actor_header )
'''
negotiations = self.org_client.find_user_negotiations(actor_id, org2_id, negotiation_status=NegotiationStatusEnum.OPEN, headers=actor_header)
self.assertEqual(len(negotiations),0)
#Check commitment to be active
commitments, _ = self.rr_client.find_objects(ia_list[0]._id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),1)
resource_commitment, _ = self.rr_client.find_objects(actor_id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(resource_commitment),1)
self.assertNotEqual(resource_commitment[0].lcstate, LCS.DELETED)
subjects, _ = self.rr_client.find_subjects(None,PRED.hasCommitment, commitments[0]._id)
self.assertEqual(len(subjects),3)
contracts, _ = self.rr_client.find_subjects(RT.Negotiation,PRED.hasContract, commitments[0]._id)
self.assertEqual(len(contracts),1)
cur_time = int(get_ion_ts())
invalid_expiration = cur_time + ( 13 * 60 * 60 * 1000 ) # 13 hours from now, beyond the 12 hour limit
#Now try to acquire the resource exclusively for longer than 12 hours
sap = IonObject(OT.AcquireResourceExclusiveProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id,
expiration=str(invalid_expiration))
sap_response = self.org_client.negotiate(sap, headers=actor_header )
#make sure the negotiation was rejected for being too long.
negotiation = self.rr_client.read(sap_response.negotiation_id)
self.assertEqual(negotiation.negotiation_status, NegotiationStatusEnum.REJECTED)
#Now try to acquire the resource exclusively for 20 minutes
cur_time = int(get_ion_ts())
valid_expiration = cur_time + ( 20 * 60 * 1000 ) # 20 minutes from now
sap = IonObject(OT.AcquireResourceExclusiveProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id,
expiration=str(valid_expiration))
sap_response = self.org_client.negotiate(sap, headers=actor_header )
#Check commitment to be active
commitments, _ = self.rr_client.find_objects(ia_list[0]._id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),2)
exclusive_contract, _ = self.rr_client.find_objects(sap_response.negotiation_id,PRED.hasContract, RT.Commitment)
self.assertEqual(len(exclusive_contract),1)
#Now try to acquire the resource exclusively again - should fail
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceExclusiveProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: not is_resource_acquired_exclusively',cm.exception.message)
#Release the exclusive commitment to the resource
self.org_client.release_commitment(exclusive_contract[0]._id, headers=actor_header)
#Check exclusive commitment to be inactive
commitments, _ = self.rr_client.find_resources(restype=RT.Commitment, lcstate=LCS.DELETED)
self.assertEqual(len(commitments),1)
self.assertEqual(commitments[0].commitment.exclusive, True)
#Shared commitment is still active
commitments, _ = self.rr_client.find_objects(ia_list[0],PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),1)
self.assertNotEqual(commitments[0].lcstate, LCS.DELETED)
#Now release the shared commitment
self.org_client.release_commitment(resource_commitment[0]._id, headers=actor_header)
#Check for both commitments to be inactive
commitments, _ = self.rr_client.find_resources(restype=RT.Commitment, lcstate=LCS.DELETED)
self.assertEqual(len(commitments),2)
commitments, _ = self.rr_client.find_objects(ia_list[0],PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),0)
#Now check some negative cases...
#Attempt to acquire the same resource from the ION Org which is not sharing it - should fail
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=self.ion_org._id, resource_id=ia_list[0]._id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_resource_shared',cm.exception.message)
#Remove the INSTRUMENT_OPERATOR_ROLE from the user.
self.org_client.revoke_role(org2_id, actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Refresh headers with new role
actor_header = get_actor_header(actor_id)
#Make a proposal to acquire a resource with an enrolled user that does not have the right role
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=org2_id, resource_id=ia_list[0]._id )
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: has_role',cm.exception.message)
gevent.sleep(self.SLEEP_TIME+1) # Wait for events to be published
#Check that there are the correct number of events
events_r = self.event_repo.find_events(origin=sap_response2.negotiation_id, event_type=OT.AcquireResourceNegotiationStatusEvent)
self.assertEquals(len(events_r), 6)
self.assertEqual(events_r[-1][2].description, ProposalStatusEnum._str_map[ProposalStatusEnum.GRANTED])
self.assertEqual(events_r[-1][2].resource_id, ia_list[0]._id)
events_c = self.event_repo.find_events(origin=org2_id, event_type=OT.ResourceCommitmentCreatedEvent)
self.assertEquals(len(events_c), 2)
events_i = self.event_repo.find_events(origin=org2_id, event_type=OT.OrgNegotiationInitiatedEvent)
self.assertEquals(len(events_i), 4)
ret = self.org_client.is_resource_shared(org_id=org2_id, resource_id=ia_list[0]._id, headers=self.system_actor_header )
self.assertEquals(ret, True)
#So unshare the resource
self.org_client.unshare_resource(org_id=org2_id, resource_id=ia_list[0]._id, headers=self.system_actor_header )
ret = self.org_client.is_resource_shared(org_id=org2_id, resource_id=ia_list[0]._id, headers=self.system_actor_header )
self.assertEquals(ret, False)
def start_instrument_direct_access(self, ia_client, actor_header):
#The reset command should now be allowed
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia_client.execute_agent(cmd, headers=actor_header)
retval = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia_client.execute_agent(cmd, headers=actor_header)
state = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(state, ResourceAgentState.INACTIVE)
cmd = AgentCommand(command=ResourceAgentEvent.GO_DIRECT_ACCESS,
#kwargs={'session_type': DirectAccessTypes.telnet,
kwargs={'session_type':DirectAccessTypes.vsp,
'session_timeout':600,
'inactivity_timeout':600})
retval = ia_client.execute_agent(cmd, headers=actor_header)
state = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(state, ResourceAgentState.DIRECT_ACCESS)
def stop_instrument_direct_access(self, ia_client, actor_header):
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia_client.execute_agent(cmd, headers=actor_header)
state = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(state, ResourceAgentState.UNINITIALIZED)
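# The two helpers above drive the agent through RESET / INITIALIZE / GO_DIRECT_ACCESS
# transitions. A minimal sketch of that state machine follows; the state and event
# names mirror ResourceAgentState/ResourceAgentEvent, but the transition table itself
# is an illustrative assumption, not the agent's actual FSM definition.

```python
# Allowed (state, event) -> next state transitions exercised by the helpers
TRANSITIONS = {
    ('UNINITIALIZED', 'INITIALIZE'): 'INACTIVE',
    ('INACTIVE', 'GO_DIRECT_ACCESS'): 'DIRECT_ACCESS',
    ('INACTIVE', 'RESET'): 'UNINITIALIZED',
    ('DIRECT_ACCESS', 'RESET'): 'UNINITIALIZED',
}

def execute(state, event):
    """Return the next state, or raise if the event is invalid in this state."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError('%s not allowed in state %s' % (event, state))

state = 'UNINITIALIZED'
state = execute(state, 'INITIALIZE')
state = execute(state, 'GO_DIRECT_ACCESS')
assert state == 'DIRECT_ACCESS'
state = execute(state, 'RESET')
assert state == 'UNINITIALIZED'
```

# An invalid command (e.g. GO_DIRECT_ACCESS from UNINITIALIZED) raises, which is
# roughly what surfaces as the Conflict exceptions asserted later in these tests.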
@attr('LOCOINT')
@attr('AGENT')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
@patch.dict(CFG, {'system':{'load_policy':True}})
def test_instrument_agent_policy(self):
# This import will dynamically load the driver egg. It is needed for the MI includes below
import ion.agents.instrument.test.test_instrument_agent
from mi.core.instrument.instrument_driver import DriverProtocolState
from mi.core.instrument.instrument_driver import DriverConnectionState
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37ProtocolEvent
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37Parameter
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
log.debug('Begin testing with policies')
#Create a new user - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn( 'identity_management(signon) has been denied',cm.exception.message)
#Create user
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.debug( "actor id=" + actor_id)
actor_header = get_actor_header(actor_id)
#Create a third user to be used as observatory operator
obs_operator_actor_obj = IonObject(RT.ActorIdentity, name='observatory operator actor')
obs_operator_actor_id,_ = self.rr_client.create(obs_operator_actor_obj)
assert(obs_operator_actor_id)
#Create a second Org
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),2)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE])
#Create Instrument Operator Role and add it to the second Org
operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE, name='Instrument Operator', description='Instrument Operator')
self.org_client.add_user_role(org2_id, operator_role, headers=self.system_actor_header)
#Create Observatory Operator Role and add it to the second Org
obs_operator_role = IonObject(RT.UserRole, governance_name=OBSERVATORY_OPERATOR_ROLE, name='Observatory Operator', description='Observatory Operator')
self.org_client.add_user_role(org2_id, obs_operator_role, headers=self.system_actor_header)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),4)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, INSTRUMENT_OPERATOR_ROLE, OBSERVATORY_OPERATOR_ROLE])
#Grant the role of Observatory Operator to the user
self.org_client.enroll_member(org2_id,obs_operator_actor_id, headers=self.system_actor_header)
self.org_client.grant_role(org2_id, obs_operator_actor_id, OBSERVATORY_OPERATOR_ROLE, headers=self.system_actor_header)
obs_operator_actor_header = get_actor_header(obs_operator_actor_id)
#Create Test InstrumentDevice - use the system admin for now
inst_obj = IonObject(RT.InstrumentDevice, name='Test_Instrument_123')
inst_obj_id,_ = self.rr_client.create(inst_obj )
#Startup an agent - TODO: will fail with Unauthorized to spawn process if not right user role
from ion.agents.instrument.test.test_instrument_agent import start_instrument_agent_process
ia_client = start_instrument_agent_process(self.container, resource_id=inst_obj_id, resource_name=inst_obj.name,
org_governance_name=org2.org_governance_name, message_headers=self.system_actor_header)
#First try a basic agent operation anonymously - it should be denied
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_capabilities(headers=self.anonymous_actor_headers )
self.assertIn('InstrumentDevice(get_capabilities) has been denied',cm.exception.message)
#However the ION Manager should be allowed
retval = ia_client.get_capabilities(headers=self.system_actor_header)
#Next try a basic agent operation with a user that is not an Instrument Operator or Member of Org - it should be denied
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_capabilities(headers=actor_header)
self.assertIn('InstrumentDevice(get_capabilities) has been denied',cm.exception.message)
#Attempt to grant a role to a user who is not a Member of the Org, so it will fail
with self.assertRaises(BadRequest) as cm:
self.org_client.grant_role(org2_id,actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
self.assertIn('The actor is not a member of the specified Org',cm.exception.message)
#Enroll user in second Org
self.org_client.enroll_member(org2_id,actor_id, headers=self.system_actor_header)
#Refresh header with updated roles
actor_header = get_actor_header(actor_id)
#Next try a basic agent operation with a user that is a member of the Org but not an Instrument Operator - it should be allowed
retval = ia_client.get_capabilities(headers=actor_header)
#This agent operation should not be allowed for anonymous user
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_resource_state(headers=self.anonymous_actor_headers)
self.assertIn('InstrumentDevice(get_resource_state) has been denied',cm.exception.message)
#This agent operation should not be allowed for a user that is not an Instrument Operator
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_resource_state(headers=actor_header)
self.assertIn('InstrumentDevice(get_resource_state) has been denied',cm.exception.message)
#This agent operation should not be allowed for a user that is not an Instrument Operator
with self.assertRaises(Unauthorized) as cm:
params = [SBE37Parameter.ALL]
retval = ia_client.get_resource(params, headers=actor_header)
self.assertIn('InstrumentDevice(get_resource) has been denied',cm.exception.message)
#This agent operation should not be allowed for a user that is not an Instrument Operator
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_agent_state(headers=actor_header)
self.assertIn('InstrumentDevice(get_agent_state) has been denied',cm.exception.message)
with self.assertRaises(Unauthorized) as cm:
retval = ia_client.get_agent(headers=actor_header)
self.assertIn('InstrumentDevice(get_agent) has been denied',cm.exception.message)
#Check the availability of the get_instrument_device_extension operation for various user types - some of the
#agent related status should not be allowed for users without the proper role
#Anonymous get extended is allowed, the internal agent status should be available.
extended_inst = self.ims_client.get_instrument_device_extension(inst_obj_id, headers=self.anonymous_actor_headers)
self.assertEqual(extended_inst._id, inst_obj_id)
self.assertEqual(extended_inst.computed.communications_status_roll_up.status, ComputedValueAvailability.PROVIDED)
self.assertEqual('', extended_inst.computed.communications_status_roll_up.reason)
#Org member get extended is allowed and the internal agent status should be available
extended_inst = self.ims_client.get_instrument_device_extension(inst_obj_id, headers=actor_header)
self.assertEqual(extended_inst._id, inst_obj_id)
self.assertEqual(extended_inst.computed.communications_status_roll_up.status, ComputedValueAvailability.PROVIDED)
self.assertEqual('', extended_inst.computed.communications_status_roll_up.reason)
#Grant the role of Instrument Operator to the user
self.org_client.grant_role(org2_id,actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Refresh header with updated roles
actor_header = get_actor_header(actor_id)
#This operation should now be allowed with the Instrument Operator role - just checking policy not agent functionality
with self.assertRaises(Conflict) as cm:
res_state = ia_client.get_resource_state(headers=actor_header)
#This operation should now be allowed with the Instrument Operator role
with self.assertRaises(Conflict) as cm:
params = [SBE37Parameter.ALL]
retval = ia_client.get_resource(params, headers=actor_header)
#Instrument Operator role get extended is allowed and contains agent status information
extended_inst = self.ims_client.get_instrument_device_extension(inst_obj_id, headers=actor_header)
self.assertEqual(extended_inst._id, inst_obj_id)
self.assertEqual(extended_inst.computed.communications_status_roll_up.status, ComputedValueAvailability.PROVIDED)
self.assertEqual(extended_inst.computed.communications_status_roll_up.reason, '')
#This agent operation should now be allowed for a user that is an Instrument Operator
retval = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
#This agent operation should now be allowed for a user that is an Instrument Operator
with self.assertRaises(BadRequest) as cm:
retval = ia_client.get_agent(params=[123], headers=actor_header)
#The execute command should fail if the user has not acquired the resource
with self.assertRaises(Unauthorized) as cm:
cmd = AgentCommand(command=SBE37ProtocolEvent.ACQUIRE_SAMPLE)
retval = ia_client.execute_resource(cmd, headers=actor_header)
self.assertIn('InstrumentDevice(execute_resource) has been denied',cm.exception.message)
#Try other operations on the agent - we only care whether they are denied,
#not whether they actually work
new_params = {
SBE37Parameter.TA0 : 2,
SBE37Parameter.INTERVAL : 1,
}
#First try anonymously - should be denied
with self.assertRaises(Unauthorized) as cm:
ia_client.set_resource(new_params, headers=self.anonymous_actor_headers)
self.assertIn('InstrumentDevice(set_resource) has been denied',cm.exception.message)
#Then try with a user with the Instrument Operator role - it should still fail without acquiring the resource
with self.assertRaises(Unauthorized) as cm:
ia_client.set_resource(new_params, headers=actor_header)
self.assertIn('InstrumentDevice(set_resource) has been denied',cm.exception.message)
#Make a proposal to acquire a resource with an enrolled user that has the right role - but the resource is not shared
with self.assertRaises(BadRequest) as cm:
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=org2_id, resource_id=inst_obj_id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
self.assertIn('A precondition for this request has not been satisfied: is_resource_shared',cm.exception.message)
#So share the resource
self.org_client.share_resource(org_id=org2_id, resource_id=inst_obj_id, headers=self.system_actor_header )
#Renegotiate the proposal
sap = IonObject(OT.AcquireResourceProposal,consumer=actor_id, provider=org2_id, resource_id=inst_obj_id)
sap_response = self.org_client.negotiate(sap, headers=actor_header )
#Have the Org accept the proposal
negotiation = self.rr_client.read(sap_response.negotiation_id)
sap_response2 = Negotiation.create_counter_proposal(negotiation, ProposalStatusEnum.ACCEPTED, ProposalOriginatorEnum.PROVIDER)
sap_response3 = self.org_client.negotiate(sap_response2, headers=self.system_actor_header )
#Have the User counter-accept the proposal
'''
negotiation = self.rr_client.read(sap_response3.negotiation_id)
sap_response4 = Negotiation.create_counter_proposal(negotiation, ProposalStatusEnum.ACCEPTED)
sap_response5 = self.org_client.negotiate(sap_response4, headers=actor_header )
'''
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and commitments recorded
#This operation should now be allowed since the resource has been acquired
with self.assertRaises(Conflict) as cm:
cmd = AgentCommand(command=SBE37ProtocolEvent.ACQUIRE_SAMPLE)
retval = ia_client.execute_resource(cmd, headers=actor_header)
#This operation should now be allowed since the resource has been acquired
with self.assertRaises(Conflict) as cm:
ia_client.set_resource(new_params, headers=actor_header)
resource_commitment, _ = self.rr_client.find_objects(actor_id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(resource_commitment),1)
self.assertNotEqual(resource_commitment[0].lcstate, LCS.DELETED)
#Request for the instrument to be put into Direct Access mode - should be denied for anonymous users
with self.assertRaises(Unauthorized) as cm:
self.start_instrument_direct_access(ia_client, actor_header=self.anonymous_actor_headers )
self.assertIn('InstrumentDevice(execute_agent) has been denied',cm.exception.message)
#Request for the instrument to be put into Direct Access mode - should be denied since user does not have exclusive access
with self.assertRaises(Unauthorized) as cm:
self.start_instrument_direct_access(ia_client, actor_header=actor_header )
self.assertIn('InstrumentDevice(execute_agent) has been denied',cm.exception.message)
#Request to access the resource exclusively for two hours
cur_time = int(get_ion_ts())
two_hour_expiration = cur_time + ( 2 * 60 * 60 * 1000 ) # 2 hours from now
sap = IonObject(OT.AcquireResourceExclusiveProposal,consumer=actor_id, provider=org2_id, resource_id=inst_obj_id,
expiration=str(two_hour_expiration))
sap_response = self.org_client.negotiate(sap, headers=actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and commitments recorded
#Should fail if another user has acquired the resource exclusively
with self.assertRaises(Unauthorized) as cm:
ia_client.set_resource(new_params, headers=obs_operator_actor_header)
self.assertIn('InstrumentDevice(set_resource) has been denied since another user',cm.exception.message)
#Should fail if another user has acquired the resource exclusively
with self.assertRaises(Unauthorized) as cm:
cmd = AgentCommand(command=SBE37ProtocolEvent.ACQUIRE_SAMPLE)
retval = ia_client.execute_resource(cmd, headers=obs_operator_actor_header)
self.assertIn('InstrumentDevice(execute_resource) has been denied since another user',cm.exception.message)
#Request Direct Access again - with a different user and it should fail since other user has exclusive access
with self.assertRaises(Unauthorized) as cm:
self.start_instrument_direct_access(ia_client, actor_header=obs_operator_actor_header )
self.assertIn('InstrumentDevice(execute_agent) has been denied since another user',cm.exception.message)
#Request Direct Access again with user that has exclusive access - and it should pass this time.
with self.assertRaises(Conflict) as cm:
self.start_instrument_direct_access(ia_client, actor_header=actor_header )
#Try stopping the direct access by other user - should fail
with self.assertRaises(Unauthorized) as cm:
self.stop_instrument_direct_access(ia_client, actor_header=obs_operator_actor_header)
self.assertIn('InstrumentDevice(execute_agent) has been denied since another user',cm.exception.message)
#Stop Direct Access by user with exclusive access - and it should pass this time.
with self.assertRaises(Conflict) as cm:
self.stop_instrument_direct_access(ia_client, actor_header=actor_header)
#Release the exclusive commitment to the resource
exclusive_contract, _ = self.rr_client.find_objects(sap_response.negotiation_id,PRED.hasContract, RT.Commitment)
self.org_client.release_commitment(exclusive_contract[0]._id, headers=actor_header)
#Request for the instrument to be put into Direct Access mode - should be denied since the user no longer has exclusive access
with self.assertRaises(Unauthorized) as cm:
self.start_instrument_direct_access(ia_client, actor_header=actor_header )
self.assertIn('InstrumentDevice(execute_agent) has been denied',cm.exception.message)
#Request Direct Access again - with obs manager role that does not need a commitment.
with self.assertRaises(Conflict) as cm:
self.start_instrument_direct_access(ia_client, actor_header=obs_operator_actor_header )
#Stop Direct Access with obs manager role that does not need a commitment.
with self.assertRaises(Conflict) as cm:
self.stop_instrument_direct_access(ia_client, actor_header=obs_operator_actor_header)
#The agent related functions should not be allowed for a user that is not an Org Manager
with self.assertRaises(Unauthorized) as cm:
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia_client.execute_agent(cmd, headers=actor_header)
self.assertIn('InstrumentDevice(execute_agent) has been denied',cm.exception.message)
#Check exclusive commitment to be inactive
commitments, _ = self.rr_client.find_resources(restype=RT.Commitment, lcstate=LCS.DELETED)
self.assertEqual(len(commitments),1)
self.assertEqual(commitments[0].commitment.exclusive, True)
#Shared commitment is still active
commitments, _ = self.rr_client.find_objects(inst_obj_id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),1)
self.assertNotEqual(commitments[0].lcstate, LCS.DELETED)
#Now release the shared commitment
self.org_client.release_commitment(resource_commitment[0]._id, headers=actor_header)
#Check for both commitments to be inactive
commitments, _ = self.rr_client.find_resources(restype=RT.Commitment, lcstate=LCS.DELETED)
self.assertEqual(len(commitments),2)
commitments, _ = self.rr_client.find_objects(inst_obj_id,PRED.hasCommitment, RT.Commitment)
self.assertEqual(len(commitments),0)
#Try again with a user with only the Instrument Operator role - it should fail without acquiring the resource
with self.assertRaises(Unauthorized) as cm:
ia_client.set_resource(new_params, headers=actor_header)
self.assertIn('InstrumentDevice(set_resource) has been denied',cm.exception.message)
#Revoke the role of Inst Operator to the user
self.org_client.revoke_role(org2_id,actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Grant the role of Org Manager to the user
self.org_client.grant_role(org2_id,actor_id, ORG_MANAGER_ROLE, headers=self.system_actor_header)
#Refresh header with updated roles
actor_header = get_actor_header(actor_id)
#Try again with a user that also has the Org Manager role - it should pass even without acquiring the resource
with self.assertRaises(Conflict) as cm:
ia_client.set_resource(new_params, headers=actor_header)
#This agent operation should now be allowed for a user that is an Org Manager
with self.assertRaises(BadRequest) as cm:
retval = ia_client.get_agent(params=[123], headers=actor_header)
#Org Manager role get extended is allowed and contains agent status information
extended_inst = self.ims_client.get_instrument_device_extension(inst_obj_id, headers=actor_header)
self.assertEqual(extended_inst._id, inst_obj_id)
self.assertEqual(extended_inst.computed.communications_status_roll_up.status, ComputedValueAvailability.PROVIDED)
self.assertEqual(extended_inst.computed.communications_status_roll_up.reason, '')
#Now reset the agent for checking operation based policy
#The reset command should now be allowed for the Org Manager
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia_client.execute_agent(cmd, headers=actor_header)
state = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(state, ResourceAgentState.INACTIVE)
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia_client.execute_agent(cmd, headers=actor_header)
retval = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
#Test an access precondition that denies RESET commands but allows all others
pre_func1 =\
"""def precondition_func(process, message, headers):
from pyon.agent.agent import ResourceAgentEvent
if message['command'].command == ResourceAgentEvent.RESET:
return False, 'ResourceAgentEvent.RESET is being denied'
else:
return True, ''
"""
#Add an example of a operation specific policy that checks internal values to decide on access
pol_id = self.pol_client.add_process_operation_precondition_policy(process_name=RT.InstrumentDevice, op='execute_agent', policy_content=pre_func1, headers=self.system_actor_header )
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#The initialize command should be allowed
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia_client.execute_agent(cmd, headers=actor_header)
retval = ia_client.get_agent_state(headers=actor_header)
self.assertEqual(retval, ResourceAgentState.INACTIVE)
#The reset command should be denied
with self.assertRaises(Unauthorized) as cm:
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia_client.execute_agent(cmd, headers=actor_header)
self.assertIn( 'ResourceAgentEvent.RESET is being denied',cm.exception.message)
#Now delete the precondition policy and try again
self.pol_client.delete_policy(pol_id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be published and policy updated
#Now try to go into Direct Access Mode directly through the agent as an Org Manager
with self.assertRaises(Conflict) as cm:
    self.start_instrument_direct_access(ia_client, actor_header=actor_header)
#Exit DA Mode
self.stop_instrument_direct_access(ia_client, actor_header=actor_header)
events_i = self.event_repo.find_events(origin=org2_id, event_type=OT.OrgNegotiationInitiatedEvent)
self.assertEquals(len(events_i), 2)
@attr('LOCOINT')
@attr('MULTI_AGENT')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
@patch.dict(CFG, {'system':{'load_policy':True}})
def test_multiple_instrument_agent_policy(self):
# This import will dynamically load the driver egg. It is needed for the MI includes below
import ion.agents.instrument.test.test_instrument_agent
from mi.core.instrument.instrument_driver import DriverProtocolState
from mi.core.instrument.instrument_driver import DriverConnectionState
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37ProtocolEvent
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37Parameter
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
log.debug('Begin testing with policies')
#Create a new user - should be denied for anonymous access
with self.assertRaises(Unauthorized) as cm:
    actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.anonymous_actor_headers)
self.assertIn('identity_management(signon) has been denied', cm.exception.message)
#Create user
actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.debug( "actor id=" + actor_id)
actor_header = get_actor_header(actor_id)
#Create a third user to be used as observatory operator
obs_operator_actor_obj = IonObject(RT.ActorIdentity, name='observatory operator actor')
obs_operator_actor_id,_ = self.rr_client.create(obs_operator_actor_obj)
assert(obs_operator_actor_id)
#Create a second Org
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),2)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE])
#Create Instrument Operator Role and add it to the second Org
operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE, name='Instrument Operator', description='Instrument Operator')
self.org_client.add_user_role(org2_id, operator_role, headers=self.system_actor_header)
#Create Observatory Operator Role and add it to the second Org
obs_operator_role = IonObject(RT.UserRole, governance_name=OBSERVATORY_OPERATOR_ROLE, name='Observatory Operator', description='Observatory Operator')
self.org_client.add_user_role(org2_id, obs_operator_role, headers=self.system_actor_header)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),4)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, INSTRUMENT_OPERATOR_ROLE, OBSERVATORY_OPERATOR_ROLE])
#Grant the role of Observatory Operator to the user
self.org_client.enroll_member(org2_id,obs_operator_actor_id, headers=self.system_actor_header)
self.org_client.grant_role(org2_id, obs_operator_actor_id, OBSERVATORY_OPERATOR_ROLE, headers=self.system_actor_header)
obs_operator_actor_header = get_actor_header(obs_operator_actor_id)
#Enroll user in second Org
self.org_client.enroll_member(org2_id,actor_id, headers=self.system_actor_header)
self.org_client.grant_role(org2_id, actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Refresh header with updated roles
actor_header = get_actor_header(actor_id)
#Create Test InstrumentDevice - use the system admin for now
inst_obj1 = IonObject(RT.InstrumentDevice, name='Test_Instrument_1')
inst_obj1_id,_ = self.rr_client.create(inst_obj1 )
#Startup an agent - TODO: will fail with Unauthorized to spawn process if not right user role
from ion.agents.instrument.test.test_instrument_agent import start_instrument_agent_process
ia1_client = start_instrument_agent_process(self.container, resource_id=inst_obj1_id, resource_name=inst_obj1.name,
                                            org_governance_name=org2.org_governance_name, message_headers=self.system_actor_header)
#First try a basic agent operation anonymously - it should be denied
with self.assertRaises(Unauthorized) as cm:
    retval = ia1_client.get_capabilities(headers=self.anonymous_actor_headers)
self.assertIn('InstrumentDevice(get_capabilities) has been denied', cm.exception.message)
#However, the Instrument Operator should be allowed
retval = ia1_client.get_capabilities(headers=actor_header)
#Create Test InstrumentDevice2 - use the system admin for now
inst_obj2 = IonObject(RT.InstrumentDevice, name='Test_Instrument_2')
inst_obj2_id,_ = self.rr_client.create(inst_obj2 )
#Startup an agent - TODO: will fail with Unauthorized to spawn process if not right user role
from ion.agents.instrument.test.test_instrument_agent import start_instrument_agent_process
ia2_client = start_instrument_agent_process(self.container, resource_id=inst_obj2_id, resource_name=inst_obj2.name,
                                            org_governance_name=org2.org_governance_name, message_headers=self.system_actor_header)
#First try a basic agent operation anonymously - it should be denied
with self.assertRaises(Unauthorized) as cm:
    retval = ia2_client.get_capabilities(headers=self.anonymous_actor_headers)
self.assertIn('InstrumentDevice(get_capabilities) has been denied', cm.exception.message)
#However the Inst Operator should be allowed
retval = ia2_client.get_capabilities(headers=actor_header)
#Create Test InstrumentDevice3 - use the system admin for now
inst_obj3 = IonObject(RT.InstrumentDevice, name='Test_Instrument_3')
inst_obj3_id,_ = self.rr_client.create(inst_obj3 )
#Startup an agent - TODO: will fail with Unauthorized to spawn process if not right user role
from ion.agents.instrument.test.test_instrument_agent import start_instrument_agent_process
ia3_client = start_instrument_agent_process(self.container, resource_id=inst_obj3_id, resource_name=inst_obj3.name,
                                            org_governance_name=org2.org_governance_name, message_headers=self.system_actor_header)
#First try a basic agent operation anonymously - it should be denied
with self.assertRaises(Unauthorized) as cm:
    retval = ia3_client.get_capabilities(headers=self.anonymous_actor_headers)
self.assertIn('InstrumentDevice(get_capabilities) has been denied', cm.exception.message)
#However the Inst Operator should be allowed
retval = ia3_client.get_capabilities(headers=actor_header)
#The initialize and reset commands should be allowed for the Observatory Operator
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia1_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia1_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.INACTIVE)
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia1_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia1_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia2_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia2_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.INACTIVE)
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia2_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia2_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
cmd = AgentCommand(command=ResourceAgentEvent.INITIALIZE)
retval = ia3_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia3_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.INACTIVE)
cmd = AgentCommand(command=ResourceAgentEvent.RESET)
retval = ia3_client.execute_agent(cmd, headers=obs_operator_actor_header)
retval = ia3_client.get_agent_state(headers=obs_operator_actor_header)
self.assertEqual(retval, ResourceAgentState.UNINITIALIZED)
#This operation is now allowed by governance for the Instrument Operator role; the Conflict comes from the agent state, not from policy
with self.assertRaises(Conflict) as cm:
    retval = ia1_client.get_resource([SBE37Parameter.ALL], headers=actor_header)
with self.assertRaises(Conflict) as cm:
    retval = ia2_client.get_resource([SBE37Parameter.ALL], headers=actor_header)
with self.assertRaises(Conflict) as cm:
    retval = ia3_client.get_resource([SBE37Parameter.ALL], headers=actor_header)
testing_header1 = {'test_for_proc_name': 'Test_Instrument_1'}
testing_header1.update(actor_header)
#This operation should be allowed with the Instrument Operator role; the Conflict comes from the agent state, not governance
with self.assertRaises(Conflict) as cm:
    retval = ia1_client.get_resource([SBE37Parameter.ALL], headers=testing_header1)
testing_header2 = {'test_for_proc_name': 'Test_Instrument_2'}
testing_header2.update(actor_header)
with self.assertRaises(Conflict) as cm:
    retval = ia2_client.get_resource([SBE37Parameter.ALL], headers=testing_header2)
testing_header3 = {'test_for_proc_name': 'Test_Instrument_3'}
testing_header3.update(actor_header)
with self.assertRaises(Conflict) as cm:
    retval = ia3_client.get_resource([SBE37Parameter.ALL], headers=testing_header3)
#Using another agent's process-name header should be denied
with self.assertRaises(Unauthorized) as cm:
    retval = ia3_client.get_resource([SBE37Parameter.ALL], headers=testing_header2)
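The `testing_header` pattern above merges a per-call marker with the actor's identity headers via `dict.update()`. A minimal standalone sketch of that merge; the header keys and values here are assumed for illustration:

```python
# Hypothetical sketch of the header merge used in the test above.
actor_header = {'ion-actor-id': 'some_actor', 'ion-actor-roles': {'Org2': ['INSTRUMENT_OPERATOR']}}
testing_header = {'test_for_proc_name': 'Test_Instrument_1'}
testing_header.update(actor_header)  # marker plus identity in one dict
# testing_header now carries both the test marker and the actor identity
```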
@attr('LOCOINT')
@attr('LCS')
@unittest.skipIf(os.getenv('CEI_LAUNCH_TEST', False),'Not integrated for CEI')
@patch.dict(CFG, {'system':{'load_policy':True}})
def test_instrument_lifecycle_policy(self):
#Make sure that the system policies have been loaded
policy_list,_ = self.rr_client.find_resources(restype=RT.Policy)
self.assertNotEqual(len(policy_list),0,"The system policies have not been loaded into the Resource Registry")
log.debug('Begin testing with policies')
#Create user
inst_operator_actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.debug( "actor id=" + inst_operator_actor_id)
inst_operator_actor_header = get_actor_header(inst_operator_actor_id)
#Create a second user to be used as regular member
member_actor_obj = IonObject(RT.ActorIdentity, name='org member actor')
member_actor_id,_ = self.rr_client.create(member_actor_obj)
assert(member_actor_id)
#Create a third user to be used as observatory operator
obs_operator_actor_obj = IonObject(RT.ActorIdentity, name='observatory operator actor')
obs_operator_actor_id,_ = self.rr_client.create(obs_operator_actor_obj)
assert(obs_operator_actor_id)
#Create a second Org
org2 = IonObject(RT.Org, name=ORG2, description='A second Org')
org2_id = self.org_client.create_org(org2, headers=self.system_actor_header)
org2 = self.org_client.find_org(ORG2)
self.assertEqual(org2_id, org2._id)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),2)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE])
#Create Instrument Operator Role and add it to the second Org
inst_operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE, name='Instrument Operator', description='Instrument Operator')
self.org_client.add_user_role(org2_id, inst_operator_role, headers=self.system_actor_header)
#Create Observatory Operator Role and add it to the second Org
obs_operator_role = IonObject(RT.UserRole, governance_name=OBSERVATORY_OPERATOR_ROLE, name='Observatory Operator', description='Observatory Operator')
self.org_client.add_user_role(org2_id, obs_operator_role, headers=self.system_actor_header)
roles = self.org_client.find_org_roles(org2_id)
self.assertEqual(len(roles),4)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MANAGER_ROLE, ORG_MEMBER_ROLE, INSTRUMENT_OPERATOR_ROLE, OBSERVATORY_OPERATOR_ROLE])
self.org_client.enroll_member(org2_id,inst_operator_actor_id, headers=self.system_actor_header)
self.org_client.enroll_member(org2_id,member_actor_id, headers=self.system_actor_header)
self.org_client.enroll_member(org2_id,obs_operator_actor_id, headers=self.system_actor_header)
#Grant the role of Instrument Operator to the user
self.org_client.grant_role(org2_id, inst_operator_actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Grant the role of Observatory Operator to the user
self.org_client.grant_role(org2_id, obs_operator_actor_id, OBSERVATORY_OPERATOR_ROLE, headers=self.system_actor_header)
#Refresh header with updated roles
inst_operator_actor_header = get_actor_header(inst_operator_actor_id)
member_actor_header = get_actor_header(member_actor_id)
obs_operator_actor_header = get_actor_header(obs_operator_actor_id)
#Attempt to Create Test InstrumentDevice as Org Member
inst_dev = IonObject(RT.InstrumentDevice, name='Test_Instrument_123')
#Should be denied without the proper role
with self.assertRaises(Unauthorized) as cm:
    inst_dev_id = self.ims_client.create_instrument_device(inst_dev, headers=member_actor_header)
self.assertIn('instrument_management(create_instrument_device) has been denied', cm.exception.message)
#Should be allowed
inst_dev_id = self.ims_client.create_instrument_device(inst_dev, headers=inst_operator_actor_header)
#Reads are always allowed anonymously
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEqual(inst_dev_obj.lcstate, LCS.DRAFT)
#Advance the Life cycle to planned. Must be INSTRUMENT_OPERATOR, so the anonymous user should fail
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.PLAN, headers=self.anonymous_actor_headers)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
#Advance the Life cycle to planned. Must be INSTRUMENT_OPERATOR, so the member user should fail
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.PLAN, headers=member_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
#Advance the Life cycle to planned. Fails even for the INSTRUMENT_OPERATOR since the resource is not yet shared with an Org
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.PLAN, headers=inst_operator_actor_header)
#self.assertIn('has not been shared with any Org', cm.exception.message)
#Share the resource with the Org so subsequent lifecycle commands can pass governance
self.org_client.share_resource(org2_id, inst_dev_id, headers=self.system_actor_header)
#Successfully advance the Life cycle to planned - this user is owner and Inst operator
self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.PLAN, headers=inst_operator_actor_header)
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEquals(inst_dev_obj.lcstate, LCS.PLANNED)
#Advance the Life cycle to DEVELOP - should pass governance but fail because of other hard-wired preconditions
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.DEVELOP, headers=obs_operator_actor_header)
self.assertNotIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
#These states are only ever allowed for the Observatory Operator (or Org Manager), so the Instrument Operator is denied
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.INTEGRATE, headers=inst_operator_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.DEPLOY, headers=inst_operator_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.RETIRE, headers=inst_operator_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
#The Observatory Operator passes governance for these states but still fails the other hard-wired preconditions
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.INTEGRATE, headers=obs_operator_actor_header)
self.assertNotIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.DEPLOY, headers=obs_operator_actor_header)
self.assertNotIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
#Back to testing a few other states with different actors
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ANNOUNCE, headers=member_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
with self.assertRaises(Unauthorized) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ENABLE, headers=member_actor_header)
self.assertIn('instrument_management(execute_instrument_device_lifecycle) has been denied', cm.exception.message)
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEquals(inst_dev_obj.lcstate, LCS.PLANNED)
self.assertEquals(inst_dev_obj.availability, AS.PRIVATE)
#print "old state: " + inst_dev_obj.lcstate
self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ANNOUNCE, headers=inst_operator_actor_header)
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEquals(inst_dev_obj.lcstate, LCS.PLANNED)
self.assertEquals(inst_dev_obj.availability, AS.DISCOVERABLE)
#print "new state: " + inst_dev_obj.lcstate
self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ENABLE, headers=inst_operator_actor_header)
with self.assertRaises(BadRequest) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ANNOUNCE, headers=obs_operator_actor_header)
self.assertIn('has no transition for event announce', cm.exception.message)
with self.assertRaises(BadRequest) as cm:
    self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.ENABLE, headers=obs_operator_actor_header)
self.assertIn('has no transition for event enable', cm.exception.message)
#Should be able to retire a device anytime
self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.RETIRE, headers=obs_operator_actor_header)
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEquals(inst_dev_obj.lcstate, LCS.RETIRED)
self.ims_client.execute_instrument_device_lifecycle(inst_dev_id, LCE.DELETE, headers=obs_operator_actor_header)
inst_dev_obj = self.ims_client.read_instrument_device(inst_dev_id)
self.assertEquals(inst_dev_obj.lcstate, LCS.DELETED)
self.ims_client.force_delete_instrument_device(inst_dev_id, headers=self.system_actor_header)
self.id_client.delete_actor_identity(inst_operator_actor_id,headers=self.system_actor_header )
self.rr_client.delete(member_actor_id)
self.rr_client.delete(obs_operator_actor_id)
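The lifecycle test above separates lcstate transitions (PLAN, DEVELOP, RETIRE, DELETE) from availability transitions (ANNOUNCE, ENABLE), and a repeated availability event raises BadRequest with 'has no transition for event ...'. A toy finite-state sketch of that availability behavior; the state names and transition table are assumptions for illustration, not the actual ION lifecycle engine:

```python
# Hypothetical availability FSM: announce moves PRIVATE -> DISCOVERABLE,
# enable moves DISCOVERABLE -> AVAILABLE; anything else has no transition.
AVAILABILITY_FSM = {
    ('PRIVATE', 'announce'): 'DISCOVERABLE',
    ('DISCOVERABLE', 'enable'): 'AVAILABLE',
}

def advance(state, event):
    try:
        return AVAILABILITY_FSM[(state, event)]
    except KeyError:
        # mirrors the BadRequest message asserted in the test above
        raise ValueError('has no transition for event %s' % event)
```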
@attr('INT', group='coi')
class TestResourcePolicyInt(IonIntegrationTestCase, ResourceHelper):
def __init__(self, *args, **kwargs):
#Hack for running tests on CentOS, which is significantly slower than a Mac
self.SLEEP_TIME = 1
ver = platform.mac_ver()
if ver[0] == '':
    self.SLEEP_TIME = 3  # Increase for non-Macs
    log.info('Not running on a Mac')
else:
    log.info('Running on a Mac')
self.care = {}
self.dontcare = {}
self.realtype = {}
IonIntegrationTestCase.__init__(self, *args, **kwargs)
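`platform.mac_ver()` returns an empty version string on non-Mac systems, which is what the constructor above keys off to pick a sleep interval. A standalone sketch of that check:

```python
import platform

# On macOS, mac_ver()[0] is a version like '12.6'; elsewhere it is ''.
mac_version = platform.mac_ver()[0]
sleep_time = 3 if mac_version == '' else 1  # slower boxes get a longer sleep
```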
def setUp(self):
# Start container
self._start_container()
#Load a deploy file
self.container.start_rel_from_url('res/deploy/r2deploy.yml')
#Instantiate a process to represent the test
process = GovernanceTestProcess()
#Load system policies after container has started all of the services
policy_loaded = CFG.get_safe('system.load_policy', False)
if not policy_loaded:
    log.debug('Loading policy')
    LoadSystemPolicy.op_load_system_policies(process)
    gevent.sleep(self.SLEEP_TIME*2)  # Wait for events to be fired and policy updated
#self.rr_client = ResourceRegistryServiceProcessClient(node=self.container.node, process=process)
self.rr_client = ResourceRegistryServiceWrapper(self.container.resource_registry, process)
self.RR = self.rr_client # support for the parent SA test class of this class
self.id_client = IdentityManagementServiceProcessClient(node=self.container.node, process=process)
self.pol_client = PolicyManagementServiceProcessClient(node=self.container.node, process=process)
self.org_client = OrgManagementServiceProcessClient(node=self.container.node, process=process)
self.ims_client = InstrumentManagementServiceProcessClient(node=self.container.node, process=process)
self.ems_client = ExchangeManagementServiceProcessClient(node=self.container.node, process=process)
self.ssclient = SchedulerServiceProcessClient(node=self.container.node, process=process)
self.sys_management = SystemManagementServiceProcessClient(node=self.container.node, process=process)
#Get info on the ION System Actor
self.system_actor = get_system_actor()
log.info('system actor:' + self.system_actor._id)
self.system_actor_header = get_system_actor_header()
#Get info on the Web Authentication Actor
self.apache_actor = get_web_authentication_actor()
if not self.apache_actor:
#Can't find the apache actor so just use the system actor
self.apache_actor = self.system_actor
self.apache_actor_header = get_actor_header(self.apache_actor._id)
self.anonymous_actor_headers = {'ion-actor-id':'anonymous'}
self.ion_org = self.org_client.find_org()
#Setup access to event repository
dsm = DatastoreManager()
ds = dsm.get_datastore("events")
self.event_repo = EventRepository(dsm)
self.care = {}
self.dontcare = {}
self.realtype = {}
def tearDown(self):
policy_list, _ = self.rr_client.find_resources(restype=RT.Policy )
#Policies must be removed in the reverse order they were added
for policy in sorted(policy_list, key=lambda p: p.ts_created, reverse=True):
    self.pol_client.delete_policy(policy._id, headers=self.system_actor_header)
gevent.sleep(self.SLEEP_TIME) # Wait for events to be fired and policy updated
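The tearDown above removes policies newest-first by sorting on `ts_created` in reverse. A self-contained sketch of that ordering, with plain dicts standing in for Policy resources:

```python
# Hypothetical policy records; ts_created grows with creation order.
policies = [
    {'_id': 'p1', 'ts_created': 100},
    {'_id': 'p2', 'ts_created': 300},
    {'_id': 'p3', 'ts_created': 200},
]

# Newest first, so later (overriding) policies are removed before earlier ones.
delete_order = [p['_id'] for p in sorted(policies, key=lambda p: p['ts_created'], reverse=True)]
# delete_order == ['p2', 'p3', 'p1']
```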
@attr('RESOURCE')
def test_related_resource_policies(self):
"""
This test verifies that policies of related resources can be set up and invoked across the "tree" of resources.
@return:
"""
# This import will dynamically load the driver egg. It is needed for the MI includes below
import ion.agents.instrument.test.test_instrument_agent
from mi.core.instrument.instrument_driver import DriverProtocolState
from mi.core.instrument.instrument_driver import DriverConnectionState
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37ProtocolEvent
from mi.instrument.seabird.sbe37smb.ooicore.driver import SBE37Parameter
obs_id = self.create_observatory(True, create_with_marine_facility=True)
orgs,_ = self.rr_client.find_subjects(RT.Org ,PRED.hasResource, obs_id)
assert orgs
obs_org = orgs[0]
#Create user
inst_operator_actor_id, valid_until, registered = self.id_client.signon(USER1_CERTIFICATE, True, headers=self.apache_actor_header)
log.debug( "actor id=" + inst_operator_actor_id)
inst_operator_actor_header = get_actor_header(inst_operator_actor_id)
#Create a second user to be used as regular member
member_actor_obj = IonObject(RT.ActorIdentity, name='org member actor')
member_actor_id,_ = self.rr_client.create(member_actor_obj)
assert(member_actor_id)
#Create a third user to be used as observatory operator
obs_operator_actor_obj = IonObject(RT.ActorIdentity, name='observatory operator actor')
obs_operator_actor_id,_ = self.rr_client.create(obs_operator_actor_obj)
assert(obs_operator_actor_id)
#Create Org Member and Instrument Operator Roles and add them to the observatory Org
org_member_role = IonObject(RT.UserRole, governance_name=ORG_MEMBER_ROLE, name='Org Member', description='Org Member')
self.org_client.add_user_role(obs_org._id, org_member_role, headers=self.system_actor_header)
inst_operator_role = IonObject(RT.UserRole, governance_name=INSTRUMENT_OPERATOR_ROLE, name='Instrument Operator', description='Instrument Operator')
self.org_client.add_user_role(obs_org._id, inst_operator_role, headers=self.system_actor_header)
#Create Observatory Operator Role and add it to the observatory Org
obs_operator_role = IonObject(RT.UserRole, governance_name=OBSERVATORY_OPERATOR_ROLE, name='Observatory Operator', description='Observatory Operator')
self.org_client.add_user_role(obs_org._id, obs_operator_role, headers=self.system_actor_header)
roles = self.org_client.find_org_roles(obs_org._id)
self.assertEqual(len(roles),3)
self.assertItemsEqual([r.governance_name for r in roles], [ORG_MEMBER_ROLE, INSTRUMENT_OPERATOR_ROLE, OBSERVATORY_OPERATOR_ROLE])
self.org_client.enroll_member(obs_org._id,inst_operator_actor_id, headers=self.system_actor_header)
self.org_client.enroll_member(obs_org._id,member_actor_id, headers=self.system_actor_header)
self.org_client.enroll_member(obs_org._id,obs_operator_actor_id, headers=self.system_actor_header)
#Grant the role of Instrument Operator to the user
self.org_client.grant_role(obs_org._id, inst_operator_actor_id, INSTRUMENT_OPERATOR_ROLE, headers=self.system_actor_header)
#Grant the role of Observatory Operator to the user
self.org_client.grant_role(obs_org._id, obs_operator_actor_id, OBSERVATORY_OPERATOR_ROLE, headers=self.system_actor_header)
#Refresh header with updated roles
inst_operator_actor_header = get_actor_header(inst_operator_actor_id)
member_actor_header = get_actor_header(member_actor_id)
obs_operator_actor_header = get_actor_header(obs_operator_actor_id)
inst_devices,_ = self.rr_client.find_resources(restype=RT.InstrumentDevice)
assert len(inst_devices) > 0
instrument_id = inst_devices[1]._id #just pick one
instrument_name = inst_devices[1].name #just pick one
#Setup commitment to acquire the resource
self.org_client.create_resource_commitment(org_id=obs_org._id, actor_id=inst_operator_actor_id, resource_id=instrument_id, headers=self.system_actor_header)
#Creating second set of resources just to make it a more filled RR
self.create_observatory(False, create_with_marine_facility=True)
#Startup an agent - TODO: will fail with Unauthorized to spawn process if not right user role
from ion.agents.instrument.test.test_instrument_agent import start_instrument_agent_process
ia_client = start_instrument_agent_process(self.container, resource_id=instrument_id, resource_name=instrument_name,
                                           org_governance_name=obs_org.org_governance_name, message_headers=self.system_actor_header)
#Try other operations on the agent; we don't care whether they actually work, just whether they get denied or not
new_params = {
    SBE37Parameter.TA0: 2,
    SBE37Parameter.INTERVAL: 100,
}
with self.assertRaises(Conflict) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
print 'Creating policy for inst: ' + instrument_id
policy_id = self.pol_client.create_resource_access_policy(instrument_id, 'restrict_50_policy',
                                                          'Check the value of the SBE37Parameter.INTERVAL parameter',
                                                          DENY_PARAM_50_RULE, headers=self.system_actor_header)
gevent.sleep(15) # Wait for events to be published and policy updated
#The policy should now deny the same command as above since the SBE37Parameter.INTERVAL value is greater than 50
with self.assertRaises(Unauthorized) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
self.assertIn('The value for SBE37Parameter.INTERVAL cannot be greater than 50', cm.exception.message)
new_params = {
    SBE37Parameter.TA0: 2,
    SBE37Parameter.INTERVAL: 35,
}
#The policy should not deny this since the SBE37Parameter.INTERVAL value is less than 50
with self.assertRaises(Conflict) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
#The policy should now deny the same command as above since the actor does not have the right role
with self.assertRaises(Unauthorized) as cm:
    ret_val = ia_client.set_resource(new_params, headers=member_actor_header)
#The policy should not deny this since the SBE37Parameter.INTERVAL value is less than 50 and the user has the right role
with self.assertRaises(Conflict) as cm:
    ret_val = ia_client.set_resource(new_params, headers=obs_operator_actor_header)
#FROM HERE ON DOWN THE TESTS ARE FOR POLICIES ON RELATED RESOURCES
'''
#Find Model
models,_ = self.rr_client.find_objects(instrument_id, PRED.hasModel, RT.InstrumentModel)
assert models
inst_model = models[0]
#Add a more restrictive policy for instrument models associated with this instrument
print 'Creating policy for model: ' + inst_model._id
policy_id = self.pol_client.create_resource_access_policy(inst_model._id, 'restrict_30_policy',
                                                          'Check the value of the SBE37Parameter.INTERVAL parameter',
                                                          DENY_PARAM_30_RULE, headers=self.system_actor_header)
gevent.sleep(5) # Wait for events to be published and policy updated
#The policy should now deny the same command as above since it checks for a lower value for all models associated with this instrument
with self.assertRaises(Unauthorized) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
self.assertIn('The value for SBE37Parameter.INTERVAL cannot be greater than 30', cm.exception.message)
new_params = {
    SBE37Parameter.TA0: 2,
    SBE37Parameter.INTERVAL: 25,
}
#The policy should not deny this since the SBE37Parameter.INTERVAL value is less than 30
with self.assertRaises(Conflict) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
#Add a more restrictive policy for the observatory that holds all of these resources
print 'Creating policy for obs: ' + obs_id
policy_id = self.pol_client.create_resource_access_policy(obs_id, 'restrict_10_policy',
                                                          'Check the value of the SBE37Parameter.INTERVAL parameter',
                                                          DENY_PARAM_10_RULE, headers=self.system_actor_header)
gevent.sleep(5) # Wait for events to be published and policy updated
#The policy should now deny the same command as above since it checks for a lower value for all instruments in this observatory
with self.assertRaises(Unauthorized) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
self.assertIn('The value for SBE37Parameter.INTERVAL cannot be greater than 10', cm.exception.message)
new_params = {
    SBE37Parameter.TA0: 2,
    SBE37Parameter.INTERVAL: 11,
}
#The policy should still deny the same command as above since it checks for a lower value for all instruments in this observatory
with self.assertRaises(Unauthorized) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
self.assertIn('The value for SBE37Parameter.INTERVAL cannot be greater than 10', cm.exception.message)
new_params = {
    SBE37Parameter.TA0: 2,
    SBE37Parameter.INTERVAL: 8,
}
#The policy should not deny this since the SBE37Parameter.INTERVAL value is less than 10
with self.assertRaises(Conflict) as cm:
    ret_val = ia_client.set_resource(new_params, headers=inst_operator_actor_header)
'''
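The commented-out section above layers policies up the resource tree (instrument, its model, the observatory), and each request is checked against every applicable cap, so the tightest limit wins. A minimal sketch of that effect; the names and the `min()` rule are assumptions for illustration, not the pyon policy engine:

```python
# Hypothetical INTERVAL caps attached at different levels of the resource tree.
policy_limits = {'instrument': 50, 'model': 30, 'observatory': 10}

def effective_limit(limits):
    """The most restrictive (smallest) cap along the chain applies."""
    return min(limits.values())

def is_allowed(interval, limits):
    # A request passes only if it satisfies every policy in the chain.
    return interval <= effective_limit(limits)
```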
#################################################################################
# The Institute for the Design of Advanced Energy Systems Integrated Platform
# Framework (IDAES IP) was produced under the DOE Institute for the
# Design of Advanced Energy Systems (IDAES), and is copyright (c) 2018-2021
# by the software owners: The Regents of the University of California, through
# Lawrence Berkeley National Laboratory, National Technology & Engineering
# Solutions of Sandia, LLC, Carnegie Mellon University, West Virginia University
# Research Corporation, et al. All rights reserved.
#
# Please see the files COPYRIGHT.md and LICENSE.md for full copyright and
# license information.
#################################################################################
import sys
import os
from unittest.mock import patch
sys.path.append(os.path.abspath(".."))  # current folder is ~/tests
from idaes.core.surrogate.pysmo.radial_basis_function import (
RadialBasisFunctions,
FeatureScaling,
)
import numpy as np
import pandas as pd
from scipy.spatial import distance
import pytest
class TestFeatureScaling:
test_data_1d = [[x] for x in range(10)]
test_data_2d = [[x, (x + 1) ** 2] for x in range(10)]
test_data_3d = [[x, x + 10, (x + 1) ** 2 + x + 10] for x in range(10)]
test_data_3d_constant = [[x, 10, (x + 1) ** 2 + 10] for x in range(10)]
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_scaling_minmax_01(self, array_type):
input_array = array_type(self.test_data_1d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
expected_output_3 = np.array([[9]])
expected_output_2 = np.array([[0]])
expected_output_1 = np.array(
(input_array - expected_output_2) / (expected_output_3 - expected_output_2)
)
np.testing.assert_array_equal(output_3, expected_output_3)
np.testing.assert_array_equal(output_2, expected_output_2)
np.testing.assert_array_equal(output_1, expected_output_1.reshape(10, 1))
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_scaling_minmax_02(self, array_type):
input_array = array_type(self.test_data_2d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
expected_output_3 = np.array([[9, 100]])
expected_output_2 = np.array([[0, 1]])
expected_output_1 = np.array(
(input_array - expected_output_2) / (expected_output_3 - expected_output_2)
)
np.testing.assert_array_equal(output_3, expected_output_3)
np.testing.assert_array_equal(output_2, expected_output_2)
np.testing.assert_array_equal(output_1, expected_output_1)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_scaling_minmax_03(self, array_type):
input_array = array_type(self.test_data_3d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
expected_output_3 = np.array([[9, 19, 119]])
expected_output_2 = np.array([[0, 10, 11]])
expected_output_1 = np.array(
(input_array - expected_output_2) / (expected_output_3 - expected_output_2)
)
np.testing.assert_array_equal(output_3, expected_output_3)
np.testing.assert_array_equal(output_2, expected_output_2)
np.testing.assert_array_equal(output_1, expected_output_1)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_scaling_minmax_04(self, array_type):
input_array = array_type(self.test_data_3d_constant)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
expected_output_3 = np.array([[9, 10, 110]])
expected_output_2 = np.array([[0, 10, 11]])
scale = expected_output_3 - expected_output_2
scale[scale == 0.0] = 1.0
expected_output_1 = (input_array - expected_output_2) / scale
np.testing.assert_array_equal(output_3, expected_output_3)
np.testing.assert_array_equal(output_2, expected_output_2)
np.testing.assert_array_equal(output_1, expected_output_1)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [list])
def test_data_scaling_minmax_05(self, array_type):
input_array = array_type(self.test_data_2d)
with pytest.raises(TypeError):
FeatureScaling.data_scaling_minmax(input_array)
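# The parametrized cases above all check the same relation. Below is a minimal
# standalone sketch of that relation (not the IDAES implementation, and omitting
# the constant-column guard exercised in test_data_scaling_minmax_04):
# scaled = (x - column_min) / (column_max - column_min).

```python
import numpy as np

def minmax_scale(data):
    """Column-wise min-max scaling; returns (scaled, mins, maxs)."""
    arr = np.asarray(data, dtype=float)
    mins = arr.min(axis=0, keepdims=True)
    maxs = arr.max(axis=0, keepdims=True)
    return (arr - mins) / (maxs - mins), mins, maxs

# Same 2-D fixture shape as test_data_2d above
scaled, mins, maxs = minmax_scale([[x, (x + 1) ** 2] for x in range(10)])
# mins -> [[0., 1.]], maxs -> [[9., 100.]]; each scaled column spans [0, 1]
```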
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_01(self, array_type):
input_array = array_type(self.test_data_1d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
output_1 = np.array(output_1).reshape(
output_1.shape[0],
)
un_output_1 = FeatureScaling.data_unscaling_minmax(output_1, output_2, output_3)
np.testing.assert_array_equal(un_output_1, np.array(input_array).reshape(10, 1))
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_02(self, array_type):
input_array = array_type(self.test_data_2d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
un_output_1 = FeatureScaling.data_unscaling_minmax(output_1, output_2, output_3)
np.testing.assert_array_equal(un_output_1, input_array)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_03(self, array_type):
input_array = array_type(self.test_data_3d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
un_output_1 = FeatureScaling.data_unscaling_minmax(output_1, output_2, output_3)
np.testing.assert_array_equal(un_output_1, input_array)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_04(self, array_type):
input_array = array_type(self.test_data_3d_constant)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
un_output_1 = FeatureScaling.data_unscaling_minmax(output_1, output_2, output_3)
np.testing.assert_array_equal(un_output_1, input_array)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_05(self, array_type):
input_array = array_type(self.test_data_2d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
min_array = np.array([[1]])
max_array = np.array([[5]])
with pytest.raises(IndexError):
FeatureScaling.data_unscaling_minmax(output_1, min_array, max_array)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_data_unscaling_minmax_06(self, array_type):
input_array = array_type(self.test_data_2d)
output_1, output_2, output_3 = FeatureScaling.data_scaling_minmax(input_array)
min_array = np.array([[1, 2, 3]])
max_array = np.array([[5, 6, 7]])
with pytest.raises(IndexError):
FeatureScaling.data_unscaling_minmax(output_1, min_array, max_array)
class TestRadialBasisFunction:
y = np.array(
[
[i, j, ((i + 1) ** 2) + ((j + 1) ** 2)]
for i in np.linspace(0, 10, 21)
for j in np.linspace(0, 10, 21)
]
)
full_data = {"x1": y[:, 0], "x2": y[:, 1], "y": y[:, 2]}
training_data = [
[i, j, ((i + 1) ** 2) + ((j + 1) ** 2)]
for i in np.linspace(0, 10, 5)
for j in np.linspace(0, 10, 5)
]
test_data = [[i, (i + 1) ** 2] for i in range(10)]
test_data_large = [[i, (i + 1) ** 2] for i in range(200)]
test_data_1d = [[(i + 1) ** 2] for i in range(10)]
test_data_3d = [[i, (i + 1) ** 2, (i + 2) ** 2] for i in range(10)]
sample_points = [[i, (i + 1) ** 2] for i in range(8)]
sample_points_large = [[i, (i + 1) ** 2] for i in range(100)]
sample_points_1d = [[(i + 1) ** 2] for i in range(8)]
sample_points_3d = [[i, (i + 1) ** 2, (i + 2) ** 2] for i in range(8)]
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__01(self, array_type):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array, basis_function=None, solution_method=None, regularization=None
)
assert RbfClass.solution_method == "algebraic"
assert RbfClass.basis_function == "gaussian"
assert RbfClass.regularization == True
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__02(self, array_type):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
)
assert RbfClass.solution_method == "pyomo"
assert RbfClass.basis_function == "linear"
assert RbfClass.regularization == False
@pytest.mark.unit
def test__init__03(self):
with pytest.raises(Exception):
RbfClass = RadialBasisFunctions(
[1, 2, 3, 4],
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__04(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array, basis_function=None, solution_method=1, regularization=None
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__05(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="idaes",
regularization=None,
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__06(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array, basis_function=1, solution_method=None, regularization=None
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__07(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function="idaes",
solution_method=None,
regularization=None,
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__08(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array, basis_function=None, solution_method=None, regularization=1
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__09(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
overwrite=1,
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__10(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname="solution.pkl",
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__11(self, array_type):
with pytest.raises(Exception):
input_array = array_type(self.test_data)
RbfClass = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname=1,
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__12(self, array_type):
file_name = "test_filename.pickle"
input_array = array_type(self.test_data)
RbfClass1 = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname=file_name,
overwrite=True,
)
p = RbfClass1.get_feature_vector()
results = RbfClass1.rbf_training()
RbfClass2 = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname=file_name,
overwrite=True,
)
assert RbfClass1.filename == RbfClass2.filename
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test__init__14(self, array_type):
input_array = array_type(self.test_data)
file_name1 = "test_filename1.pickle"
file_name2 = "test_filename2.pickle"
RbfClass1 = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname=file_name1,
overwrite=True,
)
p = RbfClass1.get_feature_vector()
RbfClass1.training()
RbfClass2 = RadialBasisFunctions(
input_array,
basis_function="LineaR",
solution_method="PyoMo",
regularization=False,
fname=file_name2,
overwrite=True,
)
assert RbfClass1.filename == file_name1
assert RbfClass2.filename == file_name2
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_r2_distance(self, array_type):
input_array = array_type(self.training_data)
u = np.array([[0.1, 0.9]])
data_feed = RadialBasisFunctions(input_array)
output = data_feed.r2_distance(u)
scaled = FeatureScaling.data_scaling_minmax(input_array)
scaled = scaled[0]
scaled_x = np.array(scaled)[:, :-1]
expected_output = np.sqrt(np.sum(np.square(scaled_x - u), axis=1))
np.testing.assert_almost_equal(expected_output, output, decimal=6)
@pytest.mark.unit
def test_gaussian_basis_transformation(self):
d_vec = np.array(
[
[0, 0],
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
]
)
shape_list = [0.001, 1, 1000]
expected_output_1 = np.exp(-1 * ((d_vec * shape_list[0]) ** 2))
expected_output_2 = np.exp(-1 * ((d_vec * shape_list[1]) ** 2))
expected_output_3 = np.exp(-1 * ((d_vec * shape_list[2]) ** 2))
output_1 = RadialBasisFunctions.gaussian_basis_transformation(
d_vec, shape_list[0]
)
output_2 = RadialBasisFunctions.gaussian_basis_transformation(
d_vec, shape_list[1]
)
output_3 = RadialBasisFunctions.gaussian_basis_transformation(
d_vec, shape_list[2]
)
np.testing.assert_array_equal(expected_output_1, output_1)
np.testing.assert_array_equal(expected_output_2, output_2)
np.testing.assert_array_equal(expected_output_3, output_3)
@pytest.mark.unit
def test_linear_transformation(self):
d_vec = np.array(
[
[0, 0],
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
]
)
output_1 = RadialBasisFunctions.linear_transformation(d_vec)
np.testing.assert_array_equal(d_vec, output_1)
@pytest.mark.unit
def test_cubic_transformation(self):
d_vec = np.array(
[
[0, 0],
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
]
)
expected_output = d_vec**3
output = RadialBasisFunctions.cubic_transformation(d_vec)
np.testing.assert_array_equal(expected_output, output)
@pytest.mark.unit
def test_multiquadric_basis_transformation(self):
d_vec = np.array(
[
[0, 0],
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
]
)
shape_list = [0.001, 1, 1000]
expected_output_1 = np.sqrt(((d_vec * shape_list[0]) ** 2) + 1)
expected_output_2 = np.sqrt(((d_vec * shape_list[1]) ** 2) + 1)
expected_output_3 = np.sqrt(((d_vec * shape_list[2]) ** 2) + 1)
output_1 = RadialBasisFunctions.multiquadric_basis_transformation(
d_vec, shape_list[0]
)
output_2 = RadialBasisFunctions.multiquadric_basis_transformation(
d_vec, shape_list[1]
)
output_3 = RadialBasisFunctions.multiquadric_basis_transformation(
d_vec, shape_list[2]
)
np.testing.assert_array_equal(expected_output_1, output_1)
np.testing.assert_array_equal(expected_output_2, output_2)
np.testing.assert_array_equal(expected_output_3, output_3)
@pytest.mark.unit
def test_inverse_multiquadric_basis_transformation(self):
d_vec = np.array(
[
[0, 0],
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
]
)
shape_list = [0.001, 1, 1000]
expected_output_1 = 1 / np.sqrt(((d_vec * shape_list[0]) ** 2) + 1)
expected_output_2 = 1 / np.sqrt(((d_vec * shape_list[1]) ** 2) + 1)
expected_output_3 = 1 / np.sqrt(((d_vec * shape_list[2]) ** 2) + 1)
output_1 = RadialBasisFunctions.inverse_multiquadric_basis_transformation(
d_vec, shape_list[0]
)
output_2 = RadialBasisFunctions.inverse_multiquadric_basis_transformation(
d_vec, shape_list[1]
)
output_3 = RadialBasisFunctions.inverse_multiquadric_basis_transformation(
d_vec, shape_list[2]
)
np.testing.assert_array_equal(expected_output_1, output_1)
np.testing.assert_array_equal(expected_output_2, output_2)
np.testing.assert_array_equal(expected_output_3, output_3)
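# The three parametric transformations verified above share one pattern:
# each maps a distance array d elementwise through a kernel with shape
# parameter c. A hedged standalone sketch (mirroring only what the expected
# values above encode, not the IDAES methods themselves):

```python
import numpy as np

def gaussian(d, c):
    # exp(-(c*d)^2)
    return np.exp(-((c * d) ** 2))

def multiquadric(d, c):
    # sqrt((c*d)^2 + 1)
    return np.sqrt((c * d) ** 2 + 1)

def inverse_multiquadric(d, c):
    # 1 / sqrt((c*d)^2 + 1)
    return 1.0 / np.sqrt((c * d) ** 2 + 1)

d = np.array([0.0, 1.0, 2.0])
gaussian(d, 1.0)              # -> [1., exp(-1), exp(-4)]
multiquadric(d, 1.0)          # -> [1., sqrt(2), sqrt(5)]
inverse_multiquadric(d, 1.0)  # -> [1., 1/sqrt(2), 1/sqrt(5)]
```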
@pytest.mark.unit
def test_thin_plate_spline_transformation(self):
d_vec = np.array(
[
[5e-6, 7e-6],
[0.005, 0.007],
[0.05, 0.07],
[0.5, 0.7],
[5, 7],
[50, 70],
[50, np.nan],
]
)
expected_output = np.nan_to_num(d_vec**2 * np.log(d_vec))
output = RadialBasisFunctions.thin_plate_spline_transformation(d_vec)
np.testing.assert_array_equal(expected_output, output)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_basis_generation(self, array_type):
input_array = array_type(self.training_data)
scaled = FeatureScaling.data_scaling_minmax(input_array[0:3])
scaled = scaled[0]
scaled_x = np.array(scaled)[:, :-1]
distance_array = distance.cdist(scaled_x, scaled_x, "euclidean")
# Linear
data_feed_01 = RadialBasisFunctions(input_array[0:3], basis_function="linear")
expected_output_1 = distance_array
output_1 = data_feed_01.basis_generation(2)
np.testing.assert_array_equal(expected_output_1, output_1)
# Cubic
data_feed_02 = RadialBasisFunctions(input_array[0:3], basis_function="cubic")
expected_output_2 = distance_array**3
output_2 = data_feed_02.basis_generation(2)
np.testing.assert_array_equal(expected_output_2, output_2)
# Spline
data_feed_03 = RadialBasisFunctions(input_array[0:3], basis_function="spline")
expected_output_3 = np.nan_to_num(distance_array**2 * np.log(distance_array))
output_3 = data_feed_03.basis_generation(2)
np.testing.assert_array_equal(expected_output_3, output_3)
# Gaussian
data_feed_04 = RadialBasisFunctions(input_array[0:3], basis_function="gaussian")
shape_value = 2
expected_output_4 = np.exp(-1 * ((distance_array * shape_value) ** 2))
output_4 = data_feed_04.basis_generation(shape_value)
np.testing.assert_array_equal(expected_output_4, output_4)
# Multiquadric
data_feed_05 = RadialBasisFunctions(input_array[0:3], basis_function="mq")
shape_value = 2
expected_output_5 = np.sqrt(((distance_array * shape_value) ** 2) + 1)
output_5 = data_feed_05.basis_generation(shape_value)
np.testing.assert_array_equal(expected_output_5, output_5)
# Inverse multiquadric
data_feed_06 = RadialBasisFunctions(input_array[0:3], basis_function="imq")
shape_value = 2
expected_output_6 = 1 / np.sqrt(((distance_array * shape_value) ** 2) + 1)
output_6 = data_feed_06.basis_generation(shape_value)
np.testing.assert_array_equal(expected_output_6, output_6)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_cost_function_01(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.zeros((x_data_nc, 1))
expected_value = 6613.875
output_1 = RadialBasisFunctions.cost_function(theta, x_vector, y)
assert output_1 == expected_value
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_cost_function_02(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array([[4.5], [3], [3], [1], [1], [0]])
expected_value = 90.625 # Calculated externally as sum(dy^2) / 2m
output_1 = RadialBasisFunctions.cost_function(theta, x_vector, y)
assert output_1 == expected_value
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_cost_function_03(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array([[2], [2], [2], [1], [1], [0]])
expected_value = 0
output_1 = RadialBasisFunctions.cost_function(theta, x_vector, y)
assert output_1 == expected_value
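# The three cost_function cases above are consistent with the ordinary
# least-squares objective J(theta) = sum((X @ theta - y)^2) / (2m), matching
# the "sum(dy^2) / 2m" comments in the tests. A self-contained sketch under
# that assumption, reproducing the 6613.875 and 0 values asserted above:

```python
import numpy as np

def cost(theta, x_vec, y):
    # Half mean squared error over m samples
    m = y.shape[0]
    residual = x_vec @ np.asarray(theta, dtype=float).reshape(-1) - y
    return float(np.sum(residual ** 2) / (2 * m))

# Rebuild the 5x5 training grid and feature matrix used by the tests:
# columns [1, x1, x2, x1^2, x2^2, x1*x2]
grid = np.array([[i, j, (i + 1) ** 2 + (j + 1) ** 2]
                 for i in np.linspace(0, 10, 5)
                 for j in np.linspace(0, 10, 5)])
x, y = grid[:, :-1], grid[:, -1]
x_vec = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                         x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

cost(np.zeros(6), x_vec, y)                            # -> 6613.875
cost([2.0, 2.0, 2.0, 1.0, 1.0, 0.0], x_vec, y)         # -> 0.0 (exact fit)
```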
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_gradient_function_01(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.zeros((x_data_nc,))
expected_value = np.array(
[[-97], [-635], [-635], [-5246.875], [-5246.875], [-3925]]
)
expected_value = expected_value.reshape(
expected_value.shape[0],
)
output_1 = RadialBasisFunctions.gradient_function(theta, x_vector, y)
np.testing.assert_equal(output_1, expected_value)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_gradient_function_02(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array(
[[4.5], [3], [3], [1], [1], [0]]
) # coefficients in (x1 + 1.5)^2 + (x2 + 1.5) ^ 2
theta = theta.reshape(
theta.shape[0],
)
expected_value = np.array(
[[12.5], [75], [75], [593.75], [593.75], [437.5]]
) # Calculated externally: see Excel sheet
expected_value = expected_value.reshape(
expected_value.shape[0],
)
output_1 = RadialBasisFunctions.gradient_function(theta, x_vector, y)
np.testing.assert_equal(output_1, expected_value)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_gradient_function_03(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array(
[[2], [2], [2], [1], [1], [0]]
) # Actual coefficients in (x1 + 1)^2 + (x2 + 1) ^ 2
theta = theta.reshape(
theta.shape[0],
)
expected_value = np.array(
[[0], [0], [0], [0], [0], [0]]
) # Calculated externally: see Excel sheet
expected_value = expected_value.reshape(
expected_value.shape[0],
)
output_1 = RadialBasisFunctions.gradient_function(theta, x_vector, y)
np.testing.assert_equal(output_1, expected_value)
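# The gradient values asserted above follow from differentiating the same
# half-MSE cost: grad J(theta) = X^T (X theta - y) / m. A standalone sketch
# (an assumption about gradient_function, consistent with the expected
# vectors in the three tests):

```python
import numpy as np

def gradient(theta, x_vec, y):
    # Gradient of J(theta) = sum((X theta - y)^2) / (2m)
    m = y.shape[0]
    return x_vec.T @ (x_vec @ np.asarray(theta, dtype=float) - y) / m

# Same 5x5 grid and feature columns [1, x1, x2, x1^2, x2^2, x1*x2] as the tests
grid = np.array([[i, j, (i + 1) ** 2 + (j + 1) ** 2]
                 for i in np.linspace(0, 10, 5)
                 for j in np.linspace(0, 10, 5)])
x, y = grid[:, :-1], grid[:, -1]
x_vec = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                         x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

gradient(np.zeros(6), x_vec, y)
# -> [-97, -635, -635, -5246.875, -5246.875, -3925]
gradient([2.0, 2.0, 2.0, 1.0, 1.0, 0.0], x_vec, y)  # -> zeros (exact fit)
```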
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_bfgs_parameter_optimization_01(self, array_type):
input_array = np.array(
[
[0, 1],
[1, 4],
[2, 9],
[3, 16],
[4, 25],
[5, 36],
[6, 49],
[7, 64],
[8, 81],
[9, 100],
]
)
x = input_array[:, 0]
y = input_array[:, 1]
x_vector = np.zeros((x.shape[0], 3))
x_vector[:, 0] = (
x[
:,
]
** 2
)
x_vector[:, 1] = x[
:,
]
x_vector[:, 2] = 1
expected_value = np.array([[1.0], [2.0], [1.0]]).reshape(
3,
)
data_feed = RadialBasisFunctions(
array_type(self.test_data), basis_function="linear", solution_method="bfgs"
)
output_1 = data_feed.bfgs_parameter_optimization(x_vector, y)
assert data_feed.solution_method == "bfgs"
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
@pytest.mark.unit
@pytest.mark.parametrize("array_type1", [np.array])
@pytest.mark.parametrize("array_type2", [pd.DataFrame])
def test_bfgs_parameter_optimization_02(self, array_type1, array_type2):
input_array = array_type1(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_vector = np.zeros((x.shape[0], 6))
x_vector[:, 0] = x[:, 0] ** 2
x_vector[:, 1] = x[:, 1] ** 2
x_vector[:, 2] = x[:, 0]
x_vector[:, 3] = x[:, 1]
x_vector[:, 4] = x[:, 1] * x[:, 0]
x_vector[:, 5] = 1
expected_value = np.array([[1.0], [1.0], [2.0], [2.0], [0.0], [2.0]]).reshape(
6,
)
data_feed = RadialBasisFunctions(
array_type2(self.full_data), solution_method="bfgs"
)
output_1 = data_feed.bfgs_parameter_optimization(x_vector, y)
assert data_feed.solution_method == "bfgs"
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
@pytest.mark.unit
def test_explicit_linear_algebra_solution_01(self):
input_array = np.array(
[
[0, 1],
[1, 4],
[2, 9],
[3, 16],
[4, 25],
[5, 36],
[6, 49],
[7, 64],
[8, 81],
[9, 100],
]
)
x = input_array[:, 0]
y = input_array[:, 1]
x_vector = np.zeros((x.shape[0], 3))
x_vector[:, 0] = (
x[
:,
]
** 2
)
x_vector[:, 1] = x[
:,
]
x_vector[:, 2] = 1
expected_value = np.array([[1.0], [2.0], [1.0]]).reshape(
3,
)
output_1 = RadialBasisFunctions.explicit_linear_algebra_solution(x_vector, y)
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_explicit_linear_algebra_solution_02(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_vector = np.zeros((x.shape[0], 6))
x_vector[:, 0] = x[:, 0] ** 2
x_vector[:, 1] = x[:, 1] ** 2
x_vector[:, 2] = x[:, 0]
x_vector[:, 3] = x[:, 1]
x_vector[:, 4] = x[:, 1] * x[:, 0]
x_vector[:, 5] = 1
expected_value = np.array([[1.0], [1.0], [2.0], [2.0], [0.0], [2.0]]).reshape(
6,
)
output_1 = RadialBasisFunctions.explicit_linear_algebra_solution(x_vector, y)
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
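# The two explicit_linear_algebra_solution tests fit a full quadratic to
# y = (x1+1)^2 + (x2+1)^2 and expect the exact coefficients back. A hedged
# sketch using np.linalg.lstsq as a stand-in for whatever linear-algebra
# route the IDAES method actually takes:

```python
import numpy as np

# 5x5 grid; feature columns in the tests' order: [x1^2, x2^2, x1, x2, x1*x2, 1]
grid = np.array([[i, j, (i + 1) ** 2 + (j + 1) ** 2]
                 for i in np.linspace(0, 10, 5)
                 for j in np.linspace(0, 10, 5)])
x, y = grid[:, :-1], grid[:, -1]
x_vec = np.column_stack([x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0], x[:, 1],
                         x[:, 0] * x[:, 1], np.ones(len(x))])

theta, *_ = np.linalg.lstsq(x_vec, y, rcond=None)
# theta is approximately [1, 1, 2, 2, 0, 2], since
# (x1+1)^2 + (x2+1)^2 = x1^2 + x2^2 + 2*x1 + 2*x2 + 0*x1*x2 + 2
```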
@pytest.mark.unit
def test_pyomo_optimization_01(self):
input_array = np.array(
[
[0, 1],
[1, 4],
[2, 9],
[3, 16],
[4, 25],
[5, 36],
[6, 49],
[7, 64],
[8, 81],
[9, 100],
]
)
x = input_array[:, 0]
y = input_array[:, 1]
x_vector = np.zeros((x.shape[0], 3))
x_vector[:, 0] = (
x[
:,
]
** 2
)
x_vector[:, 1] = x[
:,
]
x_vector[:, 2] = 1
expected_value = np.array([[1.0], [2.0], [1.0]])
output_1 = RadialBasisFunctions.pyomo_optimization(x_vector, y)
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_pyomo_optimization_02(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1]
x_vector = np.zeros((x.shape[0], 6))
x_vector[:, 0] = x[:, 0] ** 2
x_vector[:, 1] = x[:, 1] ** 2
x_vector[:, 2] = x[:, 0]
x_vector[:, 3] = x[:, 1]
x_vector[:, 4] = x[:, 1] * x[:, 0]
x_vector[:, 5] = 1
expected_value = np.array([[1.0], [1.0], [2.0], [2.0], [0.0], [2.0]])
output_1 = RadialBasisFunctions.pyomo_optimization(x_vector, y)
np.testing.assert_array_equal(expected_value, np.round(output_1, 4))
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_error_calculation_01(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1].reshape(input_array.shape[0], 1)
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.zeros((x_data_nc, 1))
expected_value_1 = 2 * 6613.875 # Calculated externally as sum(y^2) / m
expected_value_2 = expected_value_1**0.5
output_1, output_2, _ = RadialBasisFunctions.error_calculation(
theta, x_vector, y
)
assert output_1 == expected_value_1
assert output_2 == expected_value_2
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_error_calculation_02(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1].reshape(input_array.shape[0], 1)
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array(
[[4.5], [3], [3], [1], [1], [0]]
) # coefficients in (x1 + 1.5)^2 + (x2 + 1.5) ^ 2
expected_value_1 = 2 * 90.625 # Calculated externally as sum(dy^2) / 2m
expected_value_2 = expected_value_1**0.5
output_1, output_2, _ = RadialBasisFunctions.error_calculation(
theta, x_vector, y
)
assert output_1 == expected_value_1
assert output_2 == expected_value_2
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_error_calculation_03(self, array_type):
input_array = array_type(self.training_data)
x = input_array[:, :-1]
y = input_array[:, -1].reshape(input_array.shape[0], 1)
x_data_nr = x.shape[0]
x_data_nc = 6
x_vector = np.zeros((x_data_nr, x_data_nc))
x_vector[:, 0] = 1
x_vector[:, 1] = x[:, 0]
x_vector[:, 2] = x[:, 1]
x_vector[:, 3] = x[:, 0] ** 2
x_vector[:, 4] = x[:, 1] ** 2
x_vector[:, 5] = x[:, 0] * x[:, 1]
theta = np.array(
[[2], [2], [2], [1], [1], [0]]
) # Actual coefficients in (x1 + 1)^2 + (x2 + 1) ^ 2
expected_value_1 = 2 * 0 # Value should return zero for exact solution
expected_value_2 = expected_value_1**0.5
output_1, output_2, _ = RadialBasisFunctions.error_calculation(
theta, x_vector, y
)
assert output_1 == expected_value_1
assert output_2 == expected_value_2
@pytest.mark.unit
def test_r2_calculation_01(self):
y_actual = np.array([[1], [4], [9], [16], [25], [36], [49], [64], [81], [100]])
y_pred = y_actual * 1.05
expected_output = 0.993974359 # Evaluated in Excel
output = RadialBasisFunctions.r2_calculation(y_actual, y_pred)
assert round(abs(expected_output - output), 7) == 0
@pytest.mark.unit
def test_r2_calculation_02(self):
y_actual = np.array([[1], [4], [9], [16], [25], [36], [49], [64], [81], [100]])
y_pred = y_actual * 1.50
expected_output = 0.3974358974 # Evaluated in Excel
output = RadialBasisFunctions.r2_calculation(y_actual, y_pred)
assert round(abs(expected_output - output), 7) == 0
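# The two r2_calculation cases pin down the standard coefficient of
# determination, R^2 = 1 - SS_res / SS_tot. A self-contained sketch that
# reproduces the externally evaluated constants above:

```python
import numpy as np

def r2(y_actual, y_pred):
    # 1 - (residual sum of squares) / (total sum of squares about the mean)
    y_actual = np.asarray(y_actual, dtype=float)
    ss_res = np.sum((y_actual - np.asarray(y_pred, dtype=float)) ** 2)
    ss_tot = np.sum((y_actual - np.mean(y_actual)) ** 2)
    return 1.0 - ss_res / ss_tot

y = np.array([(k + 1) ** 2 for k in range(10)], dtype=float)  # 1, 4, ..., 100
r2(y, 1.05 * y)  # approximately 0.993974359, as in test_r2_calculation_01
r2(y, 1.50 * y)  # approximately 0.3974358974, as in test_r2_calculation_02
```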
def mock_basis_generation(self, r):
return np.ones((self.x_data.shape[0], self.x_data.shape[0]))
def mock_optimization(self, x, y):
return 500 * np.ones((x.shape[0], 1))
@patch.object(RadialBasisFunctions, "basis_generation", mock_basis_generation)
@patch.object(
RadialBasisFunctions, "explicit_linear_algebra_solution", mock_optimization
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_loo_error_estimation_with_rippa_method_01(self, array_type):
input_array = array_type(self.training_data)
reg_param = 0.1
shape_factor = 1
expected_x = np.ones((input_array.shape[0], input_array.shape[0])) + (
reg_param * np.eye(input_array.shape[0], input_array.shape[0])
)
expected_inverse_x = np.diag(np.linalg.pinv(expected_x))
expected_radial_weights = 500 * np.ones((input_array.shape[0], 1))
expected_errors = np.linalg.norm(
expected_radial_weights
/ (expected_inverse_x.reshape(expected_inverse_x.shape[0], 1))
)
data_feed = RadialBasisFunctions(input_array, solution_method="algebraic")
_, output_1, output_2 = data_feed.loo_error_estimation_with_rippa_method(
shape_factor, reg_param
)
assert output_1 == np.linalg.cond(expected_x)
np.testing.assert_array_equal(output_2, expected_errors)
@patch.object(RadialBasisFunctions, "basis_generation", mock_basis_generation)
@patch.object(RadialBasisFunctions, "pyomo_optimization", mock_optimization)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_loo_error_estimation_with_rippa_method_02(self, array_type):
input_array = array_type(self.training_data)
reg_param = 0.1
shape_factor = 1
expected_x = np.ones((input_array.shape[0], input_array.shape[0])) + (
reg_param * np.eye(input_array.shape[0], input_array.shape[0])
)
expected_inverse_x = np.diag(np.linalg.pinv(expected_x))
expected_radial_weights = 500 * np.ones((input_array.shape[0], 1))
expected_errors = np.linalg.norm(
expected_radial_weights
/ (expected_inverse_x.reshape(expected_inverse_x.shape[0], 1))
)
data_feed = RadialBasisFunctions(input_array, solution_method="pyomo")
_, output_1, output_2 = data_feed.loo_error_estimation_with_rippa_method(
shape_factor, reg_param
)
assert output_1 == np.linalg.cond(expected_x)
np.testing.assert_array_equal(output_2, expected_errors)
@patch.object(RadialBasisFunctions, "basis_generation", mock_basis_generation)
@patch.object(
RadialBasisFunctions, "bfgs_parameter_optimization", mock_optimization
)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_loo_error_estimation_with_rippa_method_03(self, array_type):
input_array = array_type(self.training_data)
reg_param = 0.1
shape_factor = 1
expected_x = np.ones((input_array.shape[0], input_array.shape[0])) + (
reg_param * np.eye(input_array.shape[0], input_array.shape[0])
)
expected_inverse_x = np.diag(np.linalg.pinv(expected_x))
expected_radial_weights = 500 * np.ones((input_array.shape[0], 1))
expected_errors = np.linalg.norm(
expected_radial_weights
/ (expected_inverse_x.reshape(expected_inverse_x.shape[0], 1))
)
data_feed = RadialBasisFunctions(input_array, solution_method="bfgs")
_, output_1, output_2 = data_feed.loo_error_estimation_with_rippa_method(
shape_factor, reg_param
)
assert output_1 == np.linalg.cond(expected_x)
np.testing.assert_array_equal(output_2, expected_errors)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_01(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function=None, solution_method=None, regularization=False
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_02(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="cubic",
solution_method=None,
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_03(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="linear",
solution_method=None,
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_04(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="spline",
solution_method=None,
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_05(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="gaussian",
solution_method=None,
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_06(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="mq", solution_method=None, regularization=False
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_07(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="imq",
solution_method=None,
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_08(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="algebraic",
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_09(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="BFGS",
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_10(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="pyomo",
regularization=False,
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_leave_one_out_crossvalidation_11(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function=None, solution_method=None, regularization=True
)
r_best, lambda_best, error_best = data_feed.leave_one_out_crossvalidation()
if (
(data_feed.basis_function == "gaussian")
or (data_feed.basis_function == "mq")
or (data_feed.basis_function.lower() == "imq")
):
r_set = [
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1.0,
2.0,
5.0,
7.5,
10.0,
20.0,
50.0,
75.0,
100.0,
200.0,
500.0,
1000.0,
]
else:
r_set = [0]
if data_feed.regularization is True:
reg_parameter = [
0.00001,
0.00002,
0.00005,
0.000075,
0.0001,
0.0002,
0.0005,
0.00075,
0.001,
0.002,
0.005,
0.0075,
0.01,
0.02,
0.05,
0.075,
0.1,
0.2,
0.5,
0.75,
1,
]
elif data_feed.regularization is False:
reg_parameter = [0]
_, _, expected_errors = data_feed.loo_error_estimation_with_rippa_method(
r_best, lambda_best
)
assert r_best in r_set
assert lambda_best in reg_parameter
assert error_best == expected_errors
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_training_01(self, array_type):
input_array = array_type(self.test_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="algebraic",
regularization=False,
)
results = data_feed.training()
best_r_value, best_lambda_param, _ = data_feed.leave_one_out_crossvalidation()
x_transformed = data_feed.basis_generation(best_r_value)
x_transformed = x_transformed + (
best_lambda_param * np.eye(x_transformed.shape[0], x_transformed.shape[1])
)
x_condition_number = np.linalg.cond(x_transformed)
if data_feed.solution_method == "algebraic":
radial_weights = data_feed.explicit_linear_algebra_solution(
x_transformed, data_feed.y_data
)
elif data_feed.solution_method == "pyomo":
radial_weights = data_feed.pyomo_optimization(
x_transformed, data_feed.y_data
)
elif data_feed.solution_method == "bfgs":
radial_weights = data_feed.bfgs_parameter_optimization(
x_transformed, data_feed.y_data
)
radial_weights = radial_weights.reshape(radial_weights.shape[0], 1)
(
training_ss_error,
rmse_error,
y_training_predictions_scaled,
) = data_feed.error_calculation(radial_weights, x_transformed, data_feed.y_data)
r_square = data_feed.r2_calculation(
data_feed.y_data, y_training_predictions_scaled
)
y_training_predictions = data_feed.data_min[
0, -1
] + y_training_predictions_scaled * (
data_feed.data_max[0, -1] - data_feed.data_min[0, -1]
)
np.testing.assert_array_equal(radial_weights, results.weights)
np.testing.assert_array_equal(best_r_value, results.sigma)
np.testing.assert_array_equal(best_lambda_param, results.regularization)
np.testing.assert_array_equal(data_feed.centres, results.centres)
np.testing.assert_array_equal(
y_training_predictions, results.output_predictions
)
np.testing.assert_array_equal(rmse_error, results.rmse)
np.testing.assert_array_equal(x_condition_number, results.condition_number)
np.testing.assert_array_equal(data_feed.regularization, results.regularization)
np.testing.assert_array_equal(r_square, results.R2)
assert data_feed.basis_function == results.basis_function
np.testing.assert_array_equal(data_feed.data_min[:, :-1], results.x_data_min)
np.testing.assert_array_equal(data_feed.data_max[:, :-1], results.x_data_max)
np.testing.assert_array_equal(data_feed.data_min[:, -1], results.y_data_min)
np.testing.assert_array_equal(data_feed.data_max[:, -1], results.y_data_max)
assert results.solution_status == "ok"
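# The unscaling step above maps scaled predictions back to original units
# via y = y_min + y_scaled * (y_max - y_min); a minimal standalone
# illustration of that min-max inverse transform (values are made up):

```python
import numpy as np

y_min, y_max = 2.0, 10.0
y_scaled = np.array([0.0, 0.5, 1.0])
y_unscaled = y_min + y_scaled * (y_max - y_min)
# y_unscaled -> array([ 2.,  6., 10.])
```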
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_training_02(self, array_type):
input_array = array_type(self.test_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="pyomo",
regularization=False,
)
data_feed.training()
with pytest.warns(Warning):
results = data_feed.training()
assert data_feed.solution_status == "unstable solution"
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_training_03(self, array_type):
input_array = array_type(self.test_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function=None,
solution_method="bfgs",
regularization=False,
)
data_feed.training()
with pytest.warns(Warning):
data_feed.training()
assert data_feed.solution_status == "unstable solution"
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_01(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="linear", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(distance_vec, results.weights)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_02(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="cubic", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(distance_vec**3, results.weights)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_03(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="gaussian", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(
np.exp(-1 * ((distance_vec * results.sigma) ** 2)), results.weights
)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_04(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="imq", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(
1 / np.sqrt(((distance_vec * results.sigma) ** 2) + 1), results.weights
)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_05(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="mq", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(
np.sqrt(((distance_vec * results.sigma) ** 2) + 1), results.weights
)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_predict_output_06(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="spline", regularization=False
)
results = data_feed.training()
x_test = np.array([[0, 7.5]])
output = data_feed.predict_output(x_test)
data_minimum = results.x_data_min
data_maximum = results.x_data_max
scale = data_maximum - data_minimum
scale[scale == 0.0] = 1.0
x_pred_scaled = (x_test - data_minimum) / scale
x_test = x_pred_scaled.reshape(x_test.shape)
distance_vec = distance.cdist(x_test, results.centres, "euclidean")
expected_output = np.matmul(
np.nan_to_num(distance_vec**2 * np.log(distance_vec)), results.weights
)
expected_output = results.y_data_min + expected_output * (
results.y_data_max - results.y_data_min
)
assert expected_output == output
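# Summary of the six basis transformations exercised by the prediction tests
# above, applied to one sample scaled distance r with shape parameter sigma
# (a standalone sketch; the r and sigma values are arbitrary):

```python
import numpy as np

r, sigma = 0.5, 2.0
kernels = {
    "linear": r,
    "cubic": r**3,
    "gaussian": np.exp(-((r * sigma) ** 2)),
    "mq": np.sqrt((r * sigma) ** 2 + 1),
    "imq": 1.0 / np.sqrt((r * sigma) ** 2 + 1),
    "spline": np.nan_to_num(r**2 * np.log(r)),
}
```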
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [pd.DataFrame])
def test_get_feature_vector_01(self, array_type):
input_array = array_type(self.full_data)
data_feed = RadialBasisFunctions(input_array, basis_function="linear")
output = data_feed.get_feature_vector()
expected_dict = {"x1": 0, "x2": 0}
assert expected_dict == output.extract_values()
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array])
def test_get_feature_vector_02(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(input_array, basis_function="linear")
output = data_feed.get_feature_vector()
expected_dict = {0: 0, 1: 0}
assert expected_dict == output.extract_values()
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_01(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="linear",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = data_feed.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_02(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="cubic",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = data_feed.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_03(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="gaussian",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
results = data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = results.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_04(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array, basis_function="mq", solution_method=None, regularization=False
)
p = data_feed.get_feature_vector()
results = data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = results.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_05(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="imq",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
results = data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = results.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_rbf_generate_expression_06(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="spline",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
results = data_feed.training()
lv = []
for i in p.keys():
lv.append(p[i])
rbf_expr = results.generate_expression(lv)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_pickle_load01(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="spline",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
data_feed.training()
data_feed.pickle_load(data_feed.filename)
@pytest.mark.unit
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_pickle_load02(self, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="spline",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
data_feed.training()
with pytest.raises(Exception):
data_feed.pickle_load("file_not_existing.pickle")
@pytest.mark.unit
@patch("matplotlib.pyplot.show")
@pytest.mark.parametrize("array_type", [np.array, pd.DataFrame])
def test_parity_residual_plots(self, mock_show, array_type):
input_array = array_type(self.training_data)
data_feed = RadialBasisFunctions(
input_array,
basis_function="spline",
solution_method=None,
regularization=False,
)
p = data_feed.get_feature_vector()
data_feed.training()
data_feed.parity_residual_plots()
if __name__ == "__main__":
pytest.main()
import pytest
import matplotlib.pyplot as plt
import astropy.units as u
import numpy as np
from ..skySurvey import SkySurvey
from unittest.mock import Mock
from ..skySurvey import directory
# Set up the random number generator.
np.random.seed(1234)
# Load survey
survey = SkySurvey()
BASELINE_DIR = 'baseline'
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance = 20)
def test_basic_click_event_over():
"""
Test click map with sample click event on SkySurvey over
"""
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = survey, share_yaxis = True)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance = 20)
def test_basic_click_event_over_notsharey():
"""
    Test click map with sample click event on SkySurvey over without a shared y-axis
"""
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = survey, share_yaxis = False)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance = 20)
def test_str_click_event_over():
"""
Test click map with sample click event on string load over
"""
import os.path
over_data = os.path.join(directory, "tests/test_data/test_cube.fits")
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = over_data)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance=20)
def test_cube_click_event_over():
"""
    Test click map with sample click event on SpectralCube over
"""
import os.path
from spectral_cube import SpectralCube
over_data_path = os.path.join(directory, "tests/test_data/test_cube.fits")
over_data = SpectralCube.read(over_data_path)
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = over_data,
average_beam = True, radius = 0.5)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance=20)
def test_wcs_click_event_over():
"""
Ensure click map works with wcs axes
"""
import os.path
from spectral_cube import SpectralCube
over_data_path = os.path.join(directory, "tests/test_data/test_cube.fits")
over_data = SpectralCube.read(over_data_path)
fig = plt.figure()
ax = fig.add_subplot(111, projection = over_data.wcs, slices = ['x', 'y', 0])
ax.imshow(over_data[0,:,:].data)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = over_data)
x,y,_ = over_data.wcs.wcs_world2pix(1,-4,0,0)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = x
event.ydata = y
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance=20)
def test_cube_click_event_over_radius_deg():
"""
    Test click map with sample click event on SpectralCube over with radius quantity
"""
import os.path
from spectral_cube import SpectralCube
over_data_path = os.path.join(directory, "tests/test_data/test_cube.fits")
over_data = SpectralCube.read(over_data_path)
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = over_data,
average_beam = True, radius = 0.5*u.deg)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
@pytest.mark.mpl_image_compare(baseline_dir=BASELINE_DIR, tolerance=20)
def test_cube_click_event_over_radius_none():
"""
    Test click map with sample click event on SpectralCube over with radius not set
"""
import os.path
from spectral_cube import SpectralCube
over_data_path = os.path.join(directory, "tests/test_data/test_cube.fits")
over_data = SpectralCube.read(over_data_path)
fig = plt.figure()
ax = fig.add_subplot(111)
click_map = survey.click_map(fig = fig, image_ax = ax, over_data = over_data,
average_beam = True)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
click_map.on_click(event)
return plt.gcf()
def test_cube_over_readerror():
    """
    Test click map with over data read error
    """
    with pytest.raises(FileNotFoundError):
        survey.click_map(over_data = "fakefilename.fits")
| 28.142857 | 99 | 0.685084 | 759 | 5,122 | 4.403162 | 0.134387 | 0.074207 | 0.045482 | 0.045482 | 0.830341 | 0.823758 | 0.823758 | 0.823758 | 0.823758 | 0.823758 | 0 | 0.017079 | 0.211246 | 5,122 | 181 | 100 | 28.298343 | 0.810149 | 0.100937 | 0 | 0.717742 | 0 | 0 | 0.039642 | 0.033595 | 0 | 0 | 0 | 0 | 0.016129 | 1 | 0.064516 | false | 0 | 0.129032 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
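The `Mock`-based click simulation repeated throughout the tests above can be shown in isolation. A minimal sketch with a hypothetical `ClickMap` stand-in (no matplotlib required; the class below is illustrative, not whampy's actual implementation):

```python
from unittest.mock import Mock


class ClickMap:
    """Toy stand-in mirroring the on_click handlers exercised above."""

    def __init__(self, ax):
        self.ax = ax
        self.last_click = None

    def on_click(self, event):
        # Only respond to left-clicks inside our axes, as the tests assume.
        if event.button == 1 and event.inaxes is self.ax:
            self.last_click = (event.xdata, event.ydata)


ax = object()
cm = ClickMap(ax)
event = Mock()
event.button = 1
event.inaxes = ax
event.xdata = 1
event.ydata = -4
cm.on_click(event)
print(cm.last_click)  # (1, -4)
```

Because `Mock` accepts arbitrary attribute assignment, it substitutes for a matplotlib `MouseEvent` without opening a figure, which is what keeps these image-comparison tests cheap to drive.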
b58dcc241f9341464342e44ca1195c3c434c64a6 | 2,765 | py | Python | python/tests/explainer/napoleon.py | geometer/sandbox | 373ec96e69df76744a19b51f7caa865cbc6b58cd | [
"Apache-2.0"
] | 6 | 2020-04-19T11:26:18.000Z | 2021-06-21T18:42:51.000Z | python/tests/explainer/napoleon.py | geometer/sandbox | 373ec96e69df76744a19b51f7caa865cbc6b58cd | [
"Apache-2.0"
] | 31 | 2020-04-21T17:24:39.000Z | 2020-08-27T15:59:12.000Z | python/tests/explainer/napoleon.py | geometer/sandbox | 373ec96e69df76744a19b51f7caa865cbc6b58cd | [
"Apache-2.0"
] | null | null | null | from sandbox import Scene
from sandbox.property import EquilateralTriangleProperty
from sandbox.util import Comment
from .base import ExplainerTest
class NapoleonOutward(ExplainerTest):
def createScene(self):
scene = Scene()
triangle = scene.nondegenerate_triangle(labels=['A', 'B', 'C'])
A, B, C = triangle.points
def napoleonic(A, B, C):
equilateral = scene.equilateral_triangle(A, B, C.label + '1')
_, _, V = equilateral.points
line = A.line_through(B, layer='auxiliary')
comment = Comment('$%{triangle:equilateral}$ is facing away from $%{triangle:triangle}$', {'equilateral': equilateral, 'triangle': triangle})
V.opposite_side_constraint(C, line, comment=comment)
D = scene.incentre_point(equilateral, label=C.label + '2')
napoleonic(A, B, C)
napoleonic(C, A, B)
napoleonic(B, C, A)
return scene
def testEquilateral(self):
prop = EquilateralTriangleProperty((self.scene.get('A2'), self.scene.get('B2'), self.scene.get('C2')))
self.assertIn(prop, self.explainer.context)
class NapoleonInward(ExplainerTest):
def createScene(self):
scene = Scene()
triangle = scene.nondegenerate_triangle(labels=['A', 'B', 'C'])
A, B, C = triangle.points
def napoleonic(A, B, C):
equilateral = scene.equilateral_triangle(A, B, C.label + '1')
_, _, V = equilateral.points
line = A.line_through(B, layer='auxiliary')
comment = Comment('$%{triangle:equilateral}$ is facing into $%{triangle:triangle}$', {'equilateral': equilateral, 'triangle': triangle})
V.same_side_constraint(C, line, comment=comment)
D = scene.incentre_point(equilateral, label=C.label + '2')
napoleonic(A, B, C)
napoleonic(C, A, B)
napoleonic(B, C, A)
return scene
def testEquilateral(self):
prop = EquilateralTriangleProperty((self.scene.get('A2'), self.scene.get('B2'), self.scene.get('C2')))
self.assertNotIn(prop, self.explainer.context)
class NapoleonInwardPlusTrigonometry(NapoleonInward):
def explainer_options(self):
return {'trigonometric': True}
def testEquilateral(self):
prop = EquilateralTriangleProperty((self.scene.get('A2'), self.scene.get('B2'), self.scene.get('C2')))
self.assertIn(prop, self.explainer.context)
class NapoleonInwardPlusAdvanced(NapoleonInward):
def explainer_options(self):
return {'advanced': True}
def testEquilateral(self):
prop = EquilateralTriangleProperty((self.scene.get('A2'), self.scene.get('B2'), self.scene.get('C2')))
self.assertIn(prop, self.explainer.context)
| 38.402778 | 153 | 0.643761 | 309 | 2,765 | 5.702265 | 0.197411 | 0.07151 | 0.081725 | 0.029512 | 0.843927 | 0.827469 | 0.778661 | 0.716232 | 0.716232 | 0.716232 | 0 | 0.007404 | 0.218445 | 2,765 | 71 | 154 | 38.943662 | 0.807959 | 0 | 0 | 0.722222 | 0 | 0 | 0.087523 | 0.033996 | 0 | 0 | 0 | 0 | 0.074074 | 1 | 0.185185 | false | 0 | 0.074074 | 0.037037 | 0.407407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
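The geometric fact these explainer tests target — Napoleon's theorem — can be checked numerically without the sandbox package. A sketch in the complex plane (all names here are illustrative):

```python
import cmath


def napoleon_centers(a, b, c):
    """Centroids of the equilateral triangles erected, with consistent
    orientation, on the sides of triangle abc (complex-plane rotation)."""
    rot = cmath.exp(-1j * cmath.pi / 3)  # rotation by -60 degrees

    def apex(p, q):
        # Third vertex of the equilateral triangle on side pq.
        return q + (p - q) * rot

    return [(p + q + apex(p, q)) / 3 for p, q in ((a, b), (b, c), (c, a))]


# Napoleon's theorem: the three centroids form an equilateral triangle
# for any starting triangle.
n1, n2, n3 = napoleon_centers(0 + 0j, 4 + 0j, 1 + 3j)
sides = [abs(n1 - n2), abs(n2 - n3), abs(n3 - n1)]
print(all(abs(s - sides[0]) < 1e-9 for s in sides))  # True
```

This mirrors what `testEquilateral` asserts symbolically via `EquilateralTriangleProperty`, only with floating-point arithmetic instead of the explainer's deduction engine.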
a99260132274fcfd63bb5bfe4748952152328229 | 183 | py | Python | source/hbar/core/__init__.py | neu-spiral/HBaR | 167c7884167caba4efae3dbb824db822fe0a0170 | [
"MIT"
] | 9 | 2021-11-04T16:53:04.000Z | 2022-03-28T10:27:44.000Z | source/hbar/core/__init__.py | neu-spiral/HBaR | 167c7884167caba4efae3dbb824db822fe0a0170 | [
"MIT"
] | null | null | null | source/hbar/core/__init__.py | neu-spiral/HBaR | 167c7884167caba4efae3dbb824db822fe0a0170 | [
"MIT"
] | null | null | null | from ..utils import meter
from ..utils import misc
from ..utils.io import *
from ..utils.path import *
from ..utils.dataset import *
from ..math.hsic import *
| 26.142857 | 33 | 0.628415 | 24 | 183 | 4.791667 | 0.416667 | 0.391304 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.262295 | 183 | 6 | 34 | 30.5 | 0.851852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
a9938d617f2ad71872ee56b8d3bdb0d8b93125f0 | 1,345 | py | Python | utils/getlaplacian.py | AParayil/AParayil-Distribued-Learning-via-Bayesian-Inferencing | 68b52863a0f38ddd6ee1d77b3fffb0faf1e69cf4 | [
"MIT"
] | null | null | null | utils/getlaplacian.py | AParayil/AParayil-Distribued-Learning-via-Bayesian-Inferencing | 68b52863a0f38ddd6ee1d77b3fffb0faf1e69cf4 | [
"MIT"
] | null | null | null | utils/getlaplacian.py | AParayil/AParayil-Distribued-Learning-via-Bayesian-Inferencing | 68b52863a0f38ddd6ee1d77b3fffb0faf1e69cf4 | [
"MIT"
] | null | null | null | import numpy as np
def getlaplacian(n, type=0):
if type == 0:
adj = np.zeros((n, n))
for i in range(1, n-1):
adj[i, i-1] = 1
adj[i, i+1] = 1
adj[0, 1] = 1
adj[0, n-1] = 1
adj[n-1, n-2] = 1
adj[n-1, 0] = 1
elif type == 1:
adj = np.zeros((n, n))
for i in range(1, n-1):
adj[i, i-1] = 1
adj[i, i+1] = 1
adj[0, 1] = 1
adj[n-1, n-2] = 1
elif type == 2:
        adj = np.array([[0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
                       [1, 0, 1, 1, 0, 0, 0, 0, 0, 0],
                       [0, 1, 0, 1, 0, 0, 1, 0, 0, 0],
                       [0, 1, 1, 0, 1, 1, 0, 0, 0, 0],
                       [0, 0, 0, 1, 0, 1, 0, 0, 0, 0],
                       [0, 0, 0, 1, 1, 0, 0, 0, 0, 0],
                       [0, 0, 1, 0, 0, 0, 0, 1, 0, 0],
                       [0, 0, 0, 0, 0, 0, 1, 0, 1, 0],
                       [0, 0, 0, 0, 0, 0, 0, 1, 0, 1],
                       [0, 0, 0, 0, 0, 0, 0, 0, 1, 0]])
elif type == 3:
adj = np.array([[0, 1, 1, 1, 1],
[1, 0, 1, 1, 1],
[1, 1, 0, 1, 1],
[1, 1, 1, 0, 1],
[1, 1, 1, 1, 0]])
deg = np.diag(np.sum(adj, axis=1))
lap = deg - adj
return adj, lap
| 32.804878 | 54 | 0.282528 | 240 | 1,345 | 1.583333 | 0.1125 | 0.310526 | 0.378947 | 0.4 | 0.678947 | 0.626316 | 0.626316 | 0.626316 | 0.571053 | 0.565789 | 0 | 0.249226 | 0.519703 | 1,345 | 40 | 55 | 33.625 | 0.339009 | 0 | 0 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.026316 | false | 0 | 0.026316 | 0 | 0.078947 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
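The `type == 0` branch above builds a ring (cycle) graph. Its Laplacian can be sanity-checked without numpy; a pure-Python sketch confirming the defining property that every row of a graph Laplacian sums to zero:

```python
def ring_laplacian(n):
    """Adjacency and Laplacian of the n-node ring graph (type == 0 above)."""
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        adj[i][(i - 1) % n] = 1
        adj[i][(i + 1) % n] = 1
    # L = D - A, with the row degrees on the diagonal of D.
    lap = [[(sum(adj[i]) if i == j else 0) - adj[i][j] for j in range(n)]
           for i in range(n)]
    return adj, lap


adj, lap = ring_laplacian(6)
print([sum(row) for row in lap])  # [0, 0, 0, 0, 0, 0]
```

The zero row sums are exactly what `deg - adj` in `getlaplacian` produces; they are what makes the all-ones vector a null vector of the Laplacian, a property distributed-consensus algorithms rely on.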
8d62ed865dee9e82f3cff8c93a7a3681bc95d2c3 | 95 | py | Python | Back-End/Python/Basics/Part -1 - Functional/08 - Modules, Packages/03 - Using__main__/__main__.py | ASHISHKUMAR2411/Programming-CookBook | 9c60655d64d21985ccb4196360858d98344701f9 | [
"MIT"
] | 25 | 2021-04-28T02:51:26.000Z | 2022-03-24T13:58:04.000Z | Back-End/Python/Basics/Part -1 - Functional/08 - Modules, Packages/03 - Using__main__/__main__.py | ASHISHKUMAR2411/Programming-CookBook | 9c60655d64d21985ccb4196360858d98344701f9 | [
"MIT"
] | 1 | 2022-03-03T23:33:41.000Z | 2022-03-03T23:35:41.000Z | Back-End/Python/Basics/Part -1 - Functional/08 - Modules, Packages/03 - Using__main__/__main__.py | ASHISHKUMAR2411/Programming-CookBook | 9c60655d64d21985ccb4196360858d98344701f9 | [
"MIT"
] | 15 | 2021-05-30T01:35:20.000Z | 2022-03-25T12:38:25.000Z | print(f'Loading __main__ : __name__ = {__name__}')
# >>> Loading __main__ : __name__ = __main__ | 47.5 | 50 | 0.715789 | 10 | 95 | 4.4 | 0.5 | 0.5 | 0.681818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136842 | 95 | 2 | 51 | 47.5 | 0.536585 | 0.442105 | 0 | 0 | 0 | 0 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 7 |
a5eb7cf42080272778fe7282a145e106cba61d8a | 206 | py | Python | MitreAttackAssessmentApp/runNotFirstTime.py | ashleycampion/GMIT_data_representation_module | b68d984e88919a3a812de55a84f14a90a296756f | [
"MIT"
] | null | null | null | MitreAttackAssessmentApp/runNotFirstTime.py | ashleycampion/GMIT_data_representation_module | b68d984e88919a3a812de55a84f14a90a296756f | [
"MIT"
] | null | null | null | MitreAttackAssessmentApp/runNotFirstTime.py | ashleycampion/GMIT_data_representation_module | b68d984e88919a3a812de55a84f14a90a296756f | [
"MIT"
] | null | null | null | from AttackAssessmentsApp import app
from AttackAssessmentsApp.dbFiles import createDB
from AttackAssessmentsApp.dbFiles.createTables import tableCreator
if __name__ == "__main__":
app.run(debug=True) | 29.428571 | 66 | 0.834951 | 22 | 206 | 7.454545 | 0.636364 | 0.439024 | 0.378049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106796 | 206 | 7 | 67 | 29.428571 | 0.891304 | 0 | 0 | 0 | 0 | 0 | 0.038647 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
5736eb738040240f5a46c0956f115112f9bf1447 | 1,223 | py | Python | Algo and DSA/LeetCode-Solutions-master/Python/next-permutation.py | Sourav692/FAANG-Interview-Preparation | f523e5c94d582328b3edc449ea16ac6ab28cdc81 | [
"Unlicense"
] | 3,269 | 2018-10-12T01:29:40.000Z | 2022-03-31T17:58:41.000Z | Algo and DSA/LeetCode-Solutions-master/Python/next-permutation.py | Sourav692/FAANG-Interview-Preparation | f523e5c94d582328b3edc449ea16ac6ab28cdc81 | [
"Unlicense"
] | 53 | 2018-12-16T22:54:20.000Z | 2022-02-25T08:31:20.000Z | Algo and DSA/LeetCode-Solutions-master/Python/next-permutation.py | Sourav692/FAANG-Interview-Preparation | f523e5c94d582328b3edc449ea16ac6ab28cdc81 | [
"Unlicense"
] | 1,236 | 2018-10-12T02:51:40.000Z | 2022-03-30T13:30:37.000Z | # Time: O(n)
# Space: O(1)
class Solution(object):
def nextPermutation(self, nums):
"""
:type nums: List[int]
:rtype: None Do not return anything, modify nums in-place instead.
"""
k, l = -1, 0
        for i in reversed(range(len(nums)-1)):
if nums[i] < nums[i+1]:
k = i
break
else:
nums.reverse()
return
        for i in reversed(range(k+1, len(nums))):
if nums[i] > nums[k]:
l = i
break
nums[k], nums[l] = nums[l], nums[k]
nums[k+1:] = nums[:k:-1]
# Time: O(n)
# Space: O(1)
class Solution2(object):
def nextPermutation(self, nums):
"""
:type nums: List[int]
:rtype: None Do not return anything, modify nums in-place instead.
"""
k, l = -1, 0
        for i in range(len(nums)-1):
if nums[i] < nums[i+1]:
k = i
if k == -1:
nums.reverse()
return
        for i in range(k+1, len(nums)):
if nums[i] > nums[k]:
l = i
nums[k], nums[l] = nums[l], nums[k]
nums[k+1:] = nums[:k:-1]
| 24.959184 | 74 | 0.434178 | 166 | 1,223 | 3.198795 | 0.228916 | 0.094162 | 0.045198 | 0.082863 | 0.934087 | 0.903955 | 0.817326 | 0.749529 | 0.749529 | 0.749529 | 0 | 0.025281 | 0.417825 | 1,223 | 48 | 75 | 25.479167 | 0.720506 | 0.184792 | 0 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
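The in-place algorithm above (find the rightmost ascent `k`, swap with the rightmost larger element, reverse the suffix) can be exercised as a standalone function; a minimal Python 3 sketch:

```python
def next_permutation(nums):
    """Rearrange nums into its next lexicographic permutation, in place."""
    # Rightmost index k with nums[k] < nums[k+1]; if none, nums is the
    # last permutation, so wrap around to the sorted (first) one.
    k = -1
    for i in reversed(range(len(nums) - 1)):
        if nums[i] < nums[i + 1]:
            k = i
            break
    else:
        nums.reverse()
        return
    # Rightmost l > k with nums[l] > nums[k]; swap, then reverse the suffix.
    for l in reversed(range(k + 1, len(nums))):
        if nums[l] > nums[k]:
            nums[k], nums[l] = nums[l], nums[k]
            break
    nums[k + 1:] = nums[:k:-1]


a = [1, 1, 5]
next_permutation(a)
print(a)  # [1, 5, 1]
```

Both steps scan at most once from the right, so the whole routine stays O(n) time and O(1) extra space, matching the annotations in the solutions above.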
93b6c8aff82848062d4b277ffd8208ca6c024824 | 110 | py | Python | tests/test_minus/test_another_minus.py | lejencodes/pytest_training | ec658e29e220201c1d8e9d7fabfbbef8a9783f5b | [
"MIT"
] | null | null | null | tests/test_minus/test_another_minus.py | lejencodes/pytest_training | ec658e29e220201c1d8e9d7fabfbbef8a9783f5b | [
"MIT"
] | null | null | null | tests/test_minus/test_another_minus.py | lejencodes/pytest_training | ec658e29e220201c1d8e9d7fabfbbef8a9783f5b | [
"MIT"
] | null | null | null | from minus.another_minus import another_minus
def test_another_minus():
assert another_minus(3, 2) == 12 | 22 | 45 | 0.772727 | 17 | 110 | 4.705882 | 0.588235 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0.145455 | 110 | 5 | 46 | 22 | 0.808511 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
93b715218ee450fc619a23edad38d9164fc1e5a1 | 29,994 | py | Python | coramin/relaxations/tests/test_auto_relax.py | jsiirola/Coramin | e5dd359ca885794c05447ff28f0f8aaf9c86da6e | [
"BSD-3-Clause"
] | null | null | null | coramin/relaxations/tests/test_auto_relax.py | jsiirola/Coramin | e5dd359ca885794c05447ff28f0f8aaf9c86da6e | [
"BSD-3-Clause"
] | null | null | null | coramin/relaxations/tests/test_auto_relax.py | jsiirola/Coramin | e5dd359ca885794c05447ff28f0f8aaf9c86da6e | [
"BSD-3-Clause"
] | null | null | null | import pyomo.environ as pe
import coramin
import unittest
from pyomo.contrib.derivatives.differentiate import reverse_sd
from pyomo.core.expr.visitor import identify_variables
import math
class TestAutoRelax(unittest.TestCase):
def test_product1(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var(bounds=(-1,1))
m.z = pe.Var()
m.c = pe.Constraint(expr=m.z - m.x*m.y == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 1)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, -1)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], -1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.x), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertIn(id(rel.y), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
def test_product2(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var(bounds=(-1,1))
m.z = pe.Var()
m.v = pe.Var()
m.c1 = pe.Constraint(expr=m.z - m.x*m.y == 0)
m.c2 = pe.Constraint(expr=m.v - 3*m.x*m.y == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, -1)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], -1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 2)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.v], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.x), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertIn(id(rel.y), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
def test_quadratic(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=m.x**2 + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**2 == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWXSquaredRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_cubic_convex(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(1,2))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=m.x**3 + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**3 == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 1)
self.assertAlmostEqual(rel.aux_vars[1].ub, 8)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_cubic_concave(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-2,-1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=m.x**3 + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**3 == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, -8)
self.assertAlmostEqual(rel.aux_vars[1].ub, -1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertFalse(rel.relaxations.rel0.is_convex())
self.assertTrue(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_cubic(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=m.x**3 + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**3 == 0)
rel = coramin.relaxations.relax(m)
# this problem should turn into
#
# aux2 + y + z = 0 => aux_con[1]
# w - 3*aux2 = 0 => aux_con[2]
# aux1 = x**2 => rel0
# aux2 = x*aux1 => rel1
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 2)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertAlmostEqual(rel.aux_vars[2].lb, -1)
self.assertAlmostEqual(rel.aux_vars[2].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[2]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[2]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWXSquaredRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(hasattr(rel.relaxations, 'rel1'))
self.assertTrue(isinstance(rel.relaxations.rel1, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.x), {id(rel.relaxations.rel1._x), id(rel.relaxations.rel1._y)})
self.assertIn(id(rel.aux_vars[1]), {id(rel.relaxations.rel1._x), id(rel.relaxations.rel1._y)})
self.assertEqual(id(rel.aux_vars[2]), id(rel.relaxations.rel1._w))
def test_pow_fractional1(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=0.5)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertFalse(rel.relaxations.rel0.is_convex())
self.assertTrue(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_pow_fractional2(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=1.5)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_pow_neg_even1(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(1,2))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=-2)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0.25)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
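The convexity claims that `test_pow_neg_even1` checks can be verified numerically: on x in [1, 2], f(x) = x**(-2) is convex (f'' = 6*x**(-4) > 0), so the endpoint secant over-estimates f, any tangent under-estimates it, and f maps the interval onto [0.25, 1], matching the aux-variable bounds asserted above. A standalone check (plain Python, no Pyomo or Coramin needed):

```python
P = -2
LO, HI = 1.0, 2.0

def f(x):
    return x ** P

def secant(x):
    """Linear over-estimator of a convex f: the chord through the endpoints."""
    slope = (f(HI) - f(LO)) / (HI - LO)
    return f(LO) + slope * (x - LO)

def tangent(x, at=1.5):
    """Linear under-estimator of a convex f: tangent at x=at."""
    slope = P * at ** (P - 1)  # d/dx x**p = p * x**(p-1)
    return f(at) + slope * (x - at)

assert (f(HI), f(LO)) == (0.25, 1.0)  # the aux_vars[1] bounds asserted above
for i in range(11):
    x = LO + (HI - LO) * i / 10
    assert tangent(x) <= f(x) + 1e-12 <= secant(x) + 1e-12
```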
def test_pow_neg_even2(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-2,-1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=-2)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0.25)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_pow_neg_odd1(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(1,2))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=-3)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0.125)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_pow_neg_odd2(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-2,-1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=-3)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 1)
self.assertAlmostEqual(rel.aux_vars[1].lb, -1)
self.assertAlmostEqual(rel.aux_vars[1].ub, -0.125)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[1]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[1]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertFalse(rel.relaxations.rel0.is_convex())
self.assertTrue(rel.relaxations.rel0.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel1'))
def test_pow_neg(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var()
m.z = pe.Var()
m.w = pe.Var()
m.p = pe.Param(initialize=-2)
m.c = pe.Constraint(expr=m.x**m.p + m.y + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*m.x**m.p == 0)
rel = coramin.relaxations.relax(m)
# This model should be relaxed to
#
# aux2 + y + z = 0
# w - 3 * aux2 = 0
# aux1 = x**2
# aux1*aux2 = aux3
# aux3 = 1
#
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 3)
self.assertAlmostEqual(rel.aux_vars[1].lb, 0)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertTrue(rel.aux_vars[3].is_fixed())
self.assertEqual(rel.aux_vars[3].value, 1)
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[2]], 1)
self.assertEqual(ders[rel.y], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 3)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[2]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWXSquaredRelaxation))
self.assertEqual(id(rel.x), id(rel.relaxations.rel0._x))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(rel.relaxations.rel0.is_convex())
self.assertFalse(rel.relaxations.rel0.is_concave())
self.assertTrue(hasattr(rel.relaxations, 'rel1'))
self.assertTrue(isinstance(rel.relaxations.rel1, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.aux_vars[1]), {id(rel.relaxations.rel1._x), id(rel.relaxations.rel1._y)})
self.assertIn(id(rel.aux_vars[2]), {id(rel.relaxations.rel1._x), id(rel.relaxations.rel1._y)})
self.assertEqual(id(rel.aux_vars[3]), id(rel.relaxations.rel1._w))
self.assertFalse(hasattr(rel.relaxations, 'rel2'))
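The comment in `test_pow_neg` above describes how x**(-2) is decomposed when x can cross zero: aux1 = x**2, aux1*aux2 = aux3, with aux3 fixed at 1. Eliminating the auxiliaries recovers aux2 = x**(-2), which a dependency-free check confirms:

```python
def decomposed_aux2(x):
    aux1 = x ** 2        # aux1 = x**2
    aux3 = 1.0           # aux3 is fixed to 1 (see aux_vars[3] above)
    aux2 = aux3 / aux1   # from the bilinear constraint aux1 * aux2 = aux3
    return aux2

for x in (-1.0, -0.5, 0.5, 1.0):
    assert abs(decomposed_aux2(x) - x ** -2) < 1e-12
```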
def test_exp(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(-1,1))
m.y = pe.Var(bounds=(-1,1))
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=pe.exp(m.x*m.y) + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*pe.exp(m.x*m.y) == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 2)
self.assertAlmostEqual(rel.aux_vars[1].lb, -1)
self.assertAlmostEqual(rel.aux_vars[1].ub, 1)
self.assertAlmostEqual(rel.aux_vars[2].lb, math.exp(-1))
self.assertAlmostEqual(rel.aux_vars[2].ub, math.exp(1))
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[2]], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 2)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[2]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.x), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertIn(id(rel.y), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(hasattr(rel.relaxations, 'rel1'))
self.assertTrue(isinstance(rel.relaxations.rel1, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel1._x))
self.assertEqual(id(rel.aux_vars[2]), id(rel.relaxations.rel1._w))
self.assertTrue(rel.relaxations.rel1.is_convex())
self.assertFalse(rel.relaxations.rel1.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel2'))
def test_log(self):
m = pe.ConcreteModel()
m.x = pe.Var(bounds=(1,2))
m.y = pe.Var(bounds=(1,2))
m.z = pe.Var()
m.w = pe.Var()
m.c = pe.Constraint(expr=pe.log(m.x*m.y) + m.z == 0)
m.c2 = pe.Constraint(expr=m.w - 3*pe.log(m.x*m.y) == 0)
rel = coramin.relaxations.relax(m)
self.assertTrue(hasattr(rel, 'aux_cons'))
self.assertTrue(hasattr(rel, 'aux_vars'))
self.assertEqual(len(rel.aux_cons), 2)
self.assertEqual(len(rel.aux_vars), 2)
self.assertAlmostEqual(rel.aux_vars[1].lb, 1)
self.assertAlmostEqual(rel.aux_vars[1].ub, 4)
self.assertAlmostEqual(rel.aux_vars[2].lb, math.log(1))
self.assertAlmostEqual(rel.aux_vars[2].ub, math.log(4))
self.assertEqual(rel.aux_cons[1].lower, 0)
self.assertEqual(rel.aux_cons[1].upper, 0)
ders = reverse_sd(rel.aux_cons[1].body)
self.assertEqual(ders[rel.z], 1)
self.assertEqual(ders[rel.aux_vars[2]], 1)
self.assertEqual(len(list(identify_variables(rel.aux_cons[1].body))), 2)
self.assertEqual(rel.aux_cons[2].lower, 0)
self.assertEqual(rel.aux_cons[2].upper, 0)
ders = reverse_sd(rel.aux_cons[2].body)
self.assertEqual(ders[rel.w], 1)
self.assertEqual(ders[rel.aux_vars[2]], -3)
self.assertEqual(len(list(identify_variables(rel.aux_cons[2].body))), 2)
self.assertTrue(hasattr(rel, 'relaxations'))
self.assertTrue(hasattr(rel.relaxations, 'rel0'))
self.assertTrue(isinstance(rel.relaxations.rel0, coramin.relaxations.PWMcCormickRelaxation))
self.assertIn(id(rel.x), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertIn(id(rel.y), {id(rel.relaxations.rel0._x), id(rel.relaxations.rel0._y)})
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel0._w))
self.assertTrue(hasattr(rel.relaxations, 'rel1'))
self.assertTrue(isinstance(rel.relaxations.rel1, coramin.relaxations.PWUnivariateRelaxation))
self.assertEqual(id(rel.aux_vars[1]), id(rel.relaxations.rel1._x))
self.assertEqual(id(rel.aux_vars[2]), id(rel.relaxations.rel1._w))
self.assertFalse(rel.relaxations.rel1.is_convex())
self.assertTrue(rel.relaxations.rel1.is_concave())
self.assertFalse(hasattr(rel.relaxations, 'rel2'))
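`test_log` expects the univariate relaxation to be concave rather than convex, the mirror image of the `exp` case above. For a concave function like log on [1, 4] (the aux-variable interval in the test), the endpoint secant under-estimates and tangents over-estimate, which is easy to confirm numerically:

```python
import math

LO, HI = 1.0, 4.0

def secant(x):
    """Linear under-estimator of concave log: chord through the endpoints."""
    slope = (math.log(HI) - math.log(LO)) / (HI - LO)
    return math.log(LO) + slope * (x - LO)

def tangent(x, at=2.0):
    """Linear over-estimator of concave log: tangent at x=at (d/dx log x = 1/x)."""
    return math.log(at) + (x - at) / at

for i in range(9):
    x = LO + (HI - LO) * i / 8
    assert secant(x) - 1e-12 <= math.log(x) <= tangent(x) + 1e-12
```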
# canvas_sdk/methods/group_categories.py (canvas_python_sdk, MIT license)
from canvas_sdk import client, utils
def list_group_categories_for_context_accounts(request_ctx, account_id, per_page=None, **request_kwargs):
"""
Returns a list of group categories in a context
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param account_id: (required) ID
:type account_id: string
:param per_page: (optional) Set how many results canvas should return, defaults to config.LIMIT_PER_PAGE
:type per_page: integer or None
:return: List group categories for a context
:rtype: requests.Response (with array data)
"""
if per_page is None:
per_page = request_ctx.per_page
path = '/v1/accounts/{account_id}/group_categories'
payload = {
'per_page' : per_page,
}
url = request_ctx.base_api_url + path.format(account_id=account_id)
response = client.get(request_ctx, url, payload=payload, **request_kwargs)
return response
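Every function in this module follows the same request shape: fill a `/v1/...` path template, merge the optional parameters into a payload, and hand both to `canvas_sdk.client`. The URL construction can be sketched without the SDK (the base URL below is a made-up example):

```python
base_api_url = 'https://canvas.example.edu/api'
path = '/v1/accounts/{account_id}/group_categories'
payload = {'per_page': 50}  # merged into the query string by the client

url = base_api_url + path.format(account_id='7')
assert url == 'https://canvas.example.edu/api/v1/accounts/7/group_categories'
```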
def list_group_categories_for_context_courses(request_ctx, course_id, per_page=None, **request_kwargs):
"""
Returns a list of group categories in a context
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param course_id: (required) ID
:type course_id: string
:param per_page: (optional) Set how many results canvas should return, defaults to config.LIMIT_PER_PAGE
:type per_page: integer or None
:return: List group categories for a context
:rtype: requests.Response (with array data)
"""
if per_page is None:
per_page = request_ctx.per_page
path = '/v1/courses/{course_id}/group_categories'
payload = {
'per_page' : per_page,
}
url = request_ctx.base_api_url + path.format(course_id=course_id)
response = client.get(request_ctx, url, payload=payload, **request_kwargs)
return response
def get_single_group_category(request_ctx, group_category_id, **request_kwargs):
"""
Returns the data for a single group category, or a 401 if the caller doesn't have
the rights to see it.
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:return: Get a single group category
:rtype: requests.Response (with GroupCategory data)
"""
path = '/v1/group_categories/{group_category_id}'
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.get(request_ctx, url, **request_kwargs)
return response
def create_group_category_accounts(request_ctx, account_id, name, self_signup=None, auto_leader=None, group_limit=None, create_group_count=None, split_group_count=None, **request_kwargs):
"""
Create a new group category
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param account_id: (required) ID
:type account_id: string
:param name: (required) Name of the group category
:type name: string
:param self_signup: (optional) Allow students to sign up for a group themselves (Course Only). Valid values are: "enabled":: allows students to self sign up for any group in the course; "restricted":: allows students to self sign up only for groups in the same section; null disallows self sign up.
:type self_signup: string or None
:param auto_leader: (optional) Assigns group leaders automatically when generating and allocating students to groups. Valid values are: "first":: the first student to be allocated to a group is the leader; "random":: a random student from all members is chosen as the leader.
:type auto_leader: string or None
:param group_limit: (optional) Limit the maximum number of users in each group (Course Only). Requires self signup.
:type group_limit: string or None
:param create_group_count: (optional) Create this number of groups (Course Only).
:type create_group_count: string or None
:param split_group_count: (optional) (Deprecated) Create this number of groups and evenly distribute students among them. Not allowed with "enable_self_signup". Because the group assignment happens synchronously, it is recommended that you instead use the assign_unassigned_members endpoint. (Course Only)
:type split_group_count: string or None
:return: Create a Group Category
:rtype: requests.Response (with GroupCategory data)
"""
self_signup_types = ('enabled', 'restricted')
auto_leader_types = ('first', 'random')
utils.validate_attr_is_acceptable(self_signup, self_signup_types)
utils.validate_attr_is_acceptable(auto_leader, auto_leader_types)
path = '/v1/accounts/{account_id}/group_categories'
payload = {
'name' : name,
'self_signup' : self_signup,
'auto_leader' : auto_leader,
'group_limit' : group_limit,
'create_group_count' : create_group_count,
'split_group_count' : split_group_count,
}
url = request_ctx.base_api_url + path.format(account_id=account_id)
response = client.post(request_ctx, url, payload=payload, **request_kwargs)
return response
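`utils.validate_attr_is_acceptable` is what rejects bad `self_signup`/`auto_leader` values before any request is made. The real helper lives in `canvas_sdk.utils`; the following is only a plausible stand-in illustrating the contract (None passes because the parameters are optional):

```python
def validate_attr_is_acceptable(value, acceptable_values, allow_none=True):
    """Illustrative stand-in, not the canvas_sdk implementation."""
    if value is None:
        if allow_none:
            return
        raise ValueError('value must not be None')
    if value not in acceptable_values:
        raise ValueError('%r is not one of %r' % (value, acceptable_values))

validate_attr_is_acceptable(None, ('enabled', 'restricted'))       # optional param omitted
validate_attr_is_acceptable('enabled', ('enabled', 'restricted'))  # valid choice
try:
    validate_attr_is_acceptable('open', ('enabled', 'restricted'))
except ValueError:
    pass
else:
    raise AssertionError('expected ValueError for an unsupported value')
```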
def create_group_category_courses(request_ctx, course_id, name, self_signup=None, auto_leader=None, group_limit=None, create_group_count=None, split_group_count=None, **request_kwargs):
"""
Create a new group category
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param course_id: (required) ID
:type course_id: string
:param name: (required) Name of the group category
:type name: string
:param self_signup: (optional) Allow students to sign up for a group themselves (Course Only). Valid values are: "enabled":: allows students to self sign up for any group in the course; "restricted":: allows students to self sign up only for groups in the same section; null disallows self sign up.
:type self_signup: string or None
:param auto_leader: (optional) Assigns group leaders automatically when generating and allocating students to groups. Valid values are: "first":: the first student to be allocated to a group is the leader; "random":: a random student from all members is chosen as the leader.
:type auto_leader: string or None
:param group_limit: (optional) Limit the maximum number of users in each group (Course Only). Requires self signup.
:type group_limit: string or None
:param create_group_count: (optional) Create this number of groups (Course Only).
:type create_group_count: string or None
:param split_group_count: (optional) (Deprecated) Create this number of groups and evenly distribute students among them. Not allowed with "enable_self_signup". Because the group assignment happens synchronously, it is recommended that you instead use the assign_unassigned_members endpoint. (Course Only)
:type split_group_count: string or None
:return: Create a Group Category
:rtype: requests.Response (with GroupCategory data)
"""
self_signup_types = ('enabled', 'restricted')
auto_leader_types = ('first', 'random')
utils.validate_attr_is_acceptable(self_signup, self_signup_types)
utils.validate_attr_is_acceptable(auto_leader, auto_leader_types)
path = '/v1/courses/{course_id}/group_categories'
payload = {
'name' : name,
'self_signup' : self_signup,
'auto_leader' : auto_leader,
'group_limit' : group_limit,
'create_group_count' : create_group_count,
'split_group_count' : split_group_count,
}
url = request_ctx.base_api_url + path.format(course_id=course_id)
response = client.post(request_ctx, url, payload=payload, **request_kwargs)
return response
def update_group_category(request_ctx, group_category_id, name, self_signup=None, auto_leader=None, group_limit=None, create_group_count=None, split_group_count=None, **request_kwargs):
"""
Modifies an existing group category.
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:param name: (required) Name of the group category
:type name: string
:param self_signup: (optional) Allow students to sign up for a group themselves (Course Only). Valid values are: "enabled":: allows students to self sign up for any group in the course; "restricted":: allows students to self sign up only for groups in the same section; null disallows self sign up.
:type self_signup: string or None
:param auto_leader: (optional) Assigns group leaders automatically when generating and allocating students to groups. Valid values are: "first":: the first student to be allocated to a group is the leader; "random":: a random student from all members is chosen as the leader.
:type auto_leader: string or None
:param group_limit: (optional) Limit the maximum number of users in each group (Course Only). Requires self signup.
:type group_limit: string or None
:param create_group_count: (optional) Create this number of groups (Course Only).
:type create_group_count: string or None
:param split_group_count: (optional) (Deprecated) Create this number of groups and evenly distribute students among them. Not allowed with "enable_self_signup". Because the group assignment happens synchronously, it is recommended that you instead use the assign_unassigned_members endpoint. (Course Only)
:type split_group_count: string or None
:return: Update a Group Category
:rtype: requests.Response (with GroupCategory data)
"""
self_signup_types = ('enabled', 'restricted')
auto_leader_types = ('first', 'random')
utils.validate_attr_is_acceptable(self_signup, self_signup_types)
utils.validate_attr_is_acceptable(auto_leader, auto_leader_types)
path = '/v1/group_categories/{group_category_id}'
payload = {
'name' : name,
'self_signup' : self_signup,
'auto_leader' : auto_leader,
'group_limit' : group_limit,
'create_group_count' : create_group_count,
'split_group_count' : split_group_count,
}
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.put(request_ctx, url, payload=payload, **request_kwargs)
return response
def delete_group_category(request_ctx, group_category_id, **request_kwargs):
"""
Deletes a group category and all groups under it. Protected group
categories cannot be deleted, i.e. "communities", "student_organized", and "imported".
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:return: Delete a Group Category
:rtype: requests.Response (with void data)
"""
path = '/v1/group_categories/{group_category_id}'
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.delete(request_ctx, url, **request_kwargs)
return response
def list_groups_in_group_category(request_ctx, group_category_id, per_page=None, **request_kwargs):
"""
Returns a list of groups in a group category
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:param per_page: (optional) Set how many results canvas should return, defaults to config.LIMIT_PER_PAGE
:type per_page: integer or None
:return: List groups in group category
:rtype: requests.Response (with array data)
"""
if per_page is None:
per_page = request_ctx.per_page
path = '/v1/group_categories/{group_category_id}/groups'
payload = {
'per_page' : per_page,
}
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.get(request_ctx, url, payload=payload, **request_kwargs)
return response
def list_users_in_group_category(request_ctx, group_category_id, search_term=None, unassigned=None, per_page=None, **request_kwargs):
"""
Returns a list of users in the group category.
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:param search_term: (optional) The partial name or full ID of the users to match and return in the results list. Must be at least 3 characters.
:type search_term: string or None
:param unassigned: (optional) Set this value to true if you wish only to search unassigned users in the group category.
:type unassigned: boolean or None
:param per_page: (optional) Set how many results canvas should return, defaults to config.LIMIT_PER_PAGE
:type per_page: integer or None
:return: List users in group category
:rtype: requests.Response (with array data)
"""
if per_page is None:
per_page = request_ctx.per_page
path = '/v1/group_categories/{group_category_id}/users'
payload = {
'search_term' : search_term,
'unassigned' : unassigned,
'per_page' : per_page,
}
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.get(request_ctx, url, payload=payload, **request_kwargs)
return response
def assign_unassigned_members(request_ctx, group_category_id, sync=None, **request_kwargs):
"""
Assign all unassigned members as evenly as possible among the existing
student groups.
:param request_ctx: The request context
:type request_ctx: :class:`RequestContext`
:param group_category_id: (required) ID
:type group_category_id: string
:param sync: (optional) The assigning is done asynchronously by default. If you would like to override this and have the assigning done synchronously, set this value to true.
:type sync: boolean or None
:return: Assign unassigned members
:rtype: requests.Response (with GroupMembership | Progress data)
"""
path = '/v1/group_categories/{group_category_id}/assign_unassigned_members'
payload = {
'sync' : sync,
}
url = request_ctx.base_api_url + path.format(group_category_id=group_category_id)
response = client.post(request_ctx, url, payload=payload, **request_kwargs)
return response
# sdk/storage/azure-storage-blob/tests/test_blob_access_conditions.py (azure-sdk-for-python, MIT license)
# coding: utf-8
# -------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
# --------------------------------------------------------------------------
import pytest
from datetime import datetime, timedelta
import unittest
from azure.core import MatchConditions
from azure.core.exceptions import HttpResponseError, ResourceNotFoundError, ResourceModifiedError
from azure.storage.blob import (
BlobServiceClient,
BlobClient,
BlobLeaseClient,
StorageErrorCode,
BlobBlock,
BlobType,
ContentSettings,
BlobProperties,
ContainerSasPermissions,
AccessPolicy,
generate_blob_sas,
BlobSasPermissions,
generate_account_sas,
ResourceTypes,
AccountSasPermissions, generate_container_sas, ContainerClient, CustomerProvidedEncryptionKey,
)
from fake_credentials import CPK_KEY_HASH, CPK_KEY_VALUE
from settings.testcase import BlobPreparer
from devtools_testutils.storage import StorageTestCase
# ------------------------------------------------------------------------------
LARGE_APPEND_BLOB_SIZE = 64 * 1024
# ------------------------------------------------------------------------------
class StorageBlobAccessConditionsTest(StorageTestCase):
# --Helpers-----------------------------------------------------------------
def _setup(self):
self.container_name = self.get_resource_name('utcontainer')
def _create_container(self, container_name, bsc):
container = bsc.get_container_client(container_name)
container.create_container()
return container
def _create_container_and_block_blob(self, container_name, blob_name,
blob_data, bsc):
container = self._create_container(container_name, bsc)
blob = bsc.get_blob_client(container_name, blob_name)
resp = blob.upload_blob(blob_data, length=len(blob_data))
self.assertIsNotNone(resp.get('etag'))
return container, blob
def _create_container_and_page_blob(self, container_name, blob_name,
content_length, bsc):
container = self._create_container(container_name, bsc)
blob = bsc.get_blob_client(container_name, blob_name)
resp = blob.create_page_blob(str(content_length))
return container, blob
def _create_container_and_append_blob(self, container_name, blob_name, bsc):
container = self._create_container(container_name, bsc)
blob = bsc.get_blob_client(container_name, blob_name)
resp = blob.create_append_blob()
return container, blob
# --Test cases for blob service --------------------------------------------
@BlobPreparer()
def test_get_blob_service_client_from_container(
self, storage_account_name, storage_account_key):
bsc1 = BlobServiceClient(
self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
container_client1 = self._create_container(self.container_name, bsc1)
container_client1.get_container_properties()
test_datetime = (datetime.utcnow() - timedelta(minutes=15))
# Act
metadata = {'hello': 'world', 'number': '43'}
# Set metadata to check against later
container_client1.set_container_metadata(metadata, if_modified_since=test_datetime)
# Assert metadata is set
cc1_md1 = container_client1.get_container_properties().metadata
self.assertDictEqual(metadata, cc1_md1)
# Get blob service client from container client
bsc_props1 = bsc1.get_service_properties()
bsc2 = container_client1._get_blob_service_client()
bsc_props2 = bsc2.get_service_properties()
self.assertDictEqual(bsc_props1, bsc_props2)
# Return to container and assert its properties
container_client2 = bsc2.get_container_client(self.container_name)
cc2_md1 = container_client2.get_container_properties().metadata
self.assertDictEqual(cc2_md1, cc1_md1)
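The access-condition keywords exercised throughout these tests (`if_modified_since`, etag matching) ultimately travel as standard HTTP precondition headers. A sketch of that translation, independent of the Azure SDK (the header names are the standard ones; the function itself is illustrative):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

def conditional_headers(if_modified_since=None, etag=None, if_match=False):
    """Illustrative mapping from access conditions to HTTP precondition headers."""
    headers = {}
    if if_modified_since is not None:
        headers['If-Modified-Since'] = format_datetime(if_modified_since, usegmt=True)
    if etag is not None:
        headers['If-Match' if if_match else 'If-None-Match'] = etag
    return headers

hdrs = conditional_headers(
    if_modified_since=datetime(2021, 1, 1, tzinfo=timezone.utc),
    etag='"0x8D8"', if_match=True)
assert hdrs['If-Modified-Since'] == 'Fri, 01 Jan 2021 00:00:00 GMT'
assert hdrs['If-Match'] == '"0x8D8"'
```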
@BlobPreparer()
def test_get_container_client_from_blob(self, storage_account_name, storage_account_key):
bsc = BlobServiceClient(
self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
container_client1 = self._create_container(self.container_name, bsc)
test_datetime = (datetime.utcnow() - timedelta(minutes=15))
# Act
metadata = {'hello': 'world', 'number': '43'}
# Set metadata to check against later
container_client1.set_container_metadata(metadata, if_modified_since=test_datetime)
# Assert metadata is set
md1 = container_client1.get_container_properties().metadata
self.assertDictEqual(metadata, md1)
# Create a blob from container_client1
blob_name = self.get_resource_name("testblob1")
blob_client1 = container_client1.get_blob_client(blob_name)
# Upload data to blob and get container_client again
blob_client1.upload_blob(b"this is test data")
blob_client1_data = blob_client1.download_blob().readall()
container_client2 = blob_client1._get_container_client()
md2 = container_client2.get_container_properties().metadata
self.assertEqual(md1, md2)
# Ensure we can get blob client again
blob_client2 = container_client2.get_blob_client(blob_name)
blob_client2_data = blob_client2.download_blob().readall()
self.assertEqual(blob_client1_data, blob_client2_data)
    @BlobPreparer()
    def test_set_container_metadata_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        metadata = {'hello': 'world', 'number': '43'}
        container.set_container_metadata(metadata, if_modified_since=test_datetime)

        # Assert
        md = container.get_container_properties().metadata
        self.assertDictEqual(metadata, md)

    @BlobPreparer()
    def test_set_container_metadata_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            metadata = {'hello': 'world', 'number': '43'}
            container.set_container_metadata(metadata, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_container_acl_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        access_policy = AccessPolicy(
            permission=ContainerSasPermissions(read=True),
            expiry=datetime.utcnow() + timedelta(hours=1),
            start=datetime.utcnow())
        signed_identifiers = {'testid': access_policy}
        container.set_container_access_policy(signed_identifiers, if_modified_since=test_datetime)

        # Assert
        acl = container.get_container_access_policy()
        self.assertIsNotNone(acl)

    @BlobPreparer()
    def test_set_container_acl_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        access_policy = AccessPolicy(
            permission=ContainerSasPermissions(read=True),
            expiry=datetime.utcnow() + timedelta(hours=1),
            start=datetime.utcnow())
        signed_identifiers = {'testid': access_policy}
        with self.assertRaises(ResourceModifiedError) as e:
            container.set_container_access_policy(signed_identifiers, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_container_acl_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        access_policy = AccessPolicy(
            permission=ContainerSasPermissions(read=True),
            expiry=datetime.utcnow() + timedelta(hours=1),
            start=datetime.utcnow())
        signed_identifiers = {'testid': access_policy}
        container.set_container_access_policy(signed_identifiers, if_unmodified_since=test_datetime)

        # Assert
        acl = container.get_container_access_policy()
        self.assertIsNotNone(acl)

    @BlobPreparer()
    def test_set_container_acl_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        access_policy = AccessPolicy(
            permission=ContainerSasPermissions(read=True),
            expiry=datetime.utcnow() + timedelta(hours=1),
            start=datetime.utcnow())
        signed_identifiers = {'testid': access_policy}
        with self.assertRaises(ResourceModifiedError) as e:
            container.set_container_access_policy(signed_identifiers, if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_lease_container_acquire_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        lease = container.acquire_lease(if_modified_since=test_datetime)
        lease.break_lease()

        # Assert

    @BlobPreparer()
    def test_lease_container_acquire_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            container.acquire_lease(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_lease_container_acquire_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        lease = container.acquire_lease(if_unmodified_since=test_datetime)
        lease.break_lease()

        # Assert

    @BlobPreparer()
    def test_lease_container_acquire_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            container.acquire_lease(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_delete_container_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        deleted = container.delete_container(if_modified_since=test_datetime)

        # Assert
        self.assertIsNone(deleted)
        with self.assertRaises(ResourceNotFoundError):
            container.get_container_properties()

    @BlobPreparer()
    def test_delete_container_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            container.delete_container(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_delete_container_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        container.delete_container(if_unmodified_since=test_datetime)

        # Assert
        with self.assertRaises(ResourceNotFoundError):
            container.get_container_properties()

    @BlobPreparer()
    def test_delete_container_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container = self._create_container(self.container_name, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            container.delete_container(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_multi_put_block_contains_headers(self, storage_account_name, storage_account_key):
        counter = list()

        def _validate_headers(request):
            counter.append(request)
            header = request.http_request.headers.get('x-custom-header')
            self.assertEqual(header, 'test_value')

        bsc = BlobServiceClient(
            self.account_url(storage_account_name, "blob"), storage_account_key, max_single_put_size=100, max_block_size=50)
        self._setup()
        data = self.get_random_bytes(2 * 100)
        self._create_container(self.container_name, bsc)
        blob = bsc.get_blob_client(self.container_name, "blob1")
        blob.upload_blob(
            data,
            headers={'x-custom-header': 'test_value'},
            raw_request_hook=_validate_headers
        )
        # 200 bytes with max_block_size=50 -> 4 put-block requests + 1 put-block-list
        self.assertEqual(len(counter), 5)

    @BlobPreparer()
    def test_put_blob_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        resp = blob.upload_blob(data, length=len(data), if_modified_since=test_datetime)

        # Assert
        self.assertIsNotNone(resp.get('etag'))

    @BlobPreparer()
    def test_put_blob_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_blob(data, length=len(data), if_modified_since=test_datetime, overwrite=True)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_put_blob_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        resp = blob.upload_blob(data, length=len(data), if_unmodified_since=test_datetime)

        # Assert
        self.assertIsNotNone(resp.get('etag'))

    @BlobPreparer()
    def test_put_blob_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_blob(data, length=len(data), if_unmodified_since=test_datetime, overwrite=True)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_put_blob_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        etag = blob.get_blob_properties().etag

        # Act
        resp = blob.upload_blob(data, length=len(data), etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertIsNotNone(resp.get('etag'))

        with self.assertRaises(ValueError):
            blob.upload_blob(data, length=len(data), etag=etag)
        with self.assertRaises(ValueError):
            blob.upload_blob(data, length=len(data), match_condition=MatchConditions.IfNotModified)

    @BlobPreparer()
    def test_put_blob_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_blob(
                data,
                length=len(data),
                etag='0x111111111111111',
                match_condition=MatchConditions.IfNotModified,
                overwrite=True)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_put_blob_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)

        # Act
        resp = blob.upload_blob(data, length=len(data), etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        self.assertIsNotNone(resp.get('etag'))

        with self.assertRaises(ValueError):
            blob.upload_blob(data, length=len(data), etag='0x111111111111111')
        with self.assertRaises(ValueError):
            blob.upload_blob(data, length=len(data), match_condition=MatchConditions.IfModified)

    @BlobPreparer()
    def test_put_blob_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        data = b'hello world'
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', data, bsc)
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_blob(data, length=len(data), etag=etag, match_condition=MatchConditions.IfModified, overwrite=True)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        content = blob.download_blob(if_modified_since=test_datetime).readall()

        # Assert
        self.assertEqual(content, b'hello world')

    @BlobPreparer()
    def test_get_blob_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.download_blob(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        content = blob.download_blob(if_unmodified_since=test_datetime).readall()

        # Assert
        self.assertEqual(content, b'hello world')

    @BlobPreparer()
    def test_get_blob_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.download_blob(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        etag = blob.get_blob_properties().etag

        # Act
        content = blob.download_blob(etag=etag, match_condition=MatchConditions.IfNotModified).readall()

        # Assert
        self.assertEqual(content, b'hello world')

    @BlobPreparer()
    def test_get_blob_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.download_blob(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        content = blob.download_blob(etag='0x111111111111111', match_condition=MatchConditions.IfModified).readall()

        # Assert
        self.assertEqual(content, b'hello world')

    @BlobPreparer()
    def test_get_blob_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.download_blob(etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_properties_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        content_settings = ContentSettings(
            content_language='spanish',
            content_disposition='inline')
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_http_headers(content_settings, if_modified_since=test_datetime)

        # Assert
        properties = blob.get_blob_properties()
        self.assertEqual(content_settings.content_language, properties.content_settings.content_language)
        self.assertEqual(content_settings.content_disposition, properties.content_settings.content_disposition)

    @BlobPreparer()
    def test_set_blob_properties_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            content_settings = ContentSettings(
                content_language='spanish',
                content_disposition='inline')
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_http_headers(content_settings, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_properties_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        content_settings = ContentSettings(
            content_language='spanish',
            content_disposition='inline')
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_http_headers(content_settings, if_unmodified_since=test_datetime)

        # Assert
        properties = blob.get_blob_properties()
        self.assertEqual(content_settings.content_language, properties.content_settings.content_language)
        self.assertEqual(content_settings.content_disposition, properties.content_settings.content_disposition)

    @BlobPreparer()
    def test_set_blob_properties_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            content_settings = ContentSettings(
                content_language='spanish',
                content_disposition='inline')
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_http_headers(content_settings, if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @pytest.mark.playback_test_only
    @BlobPreparer()
    def test_get_properties_last_access_time(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key,
                                connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')

        # Act
        lat = blob.get_blob_properties().last_accessed_on
        blob.stage_block(block_id='1', data="this is test content")
        blob.commit_block_list(['1'])
        new_lat = blob.get_blob_properties().last_accessed_on

        # Assert
        self.assertIsInstance(lat, datetime)
        self.assertIsInstance(new_lat, datetime)
        self.assertGreater(new_lat, lat)
        self.assertIsInstance(blob.download_blob().properties.last_accessed_on, datetime)

    @BlobPreparer()
    def test_set_blob_properties_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        content_settings = ContentSettings(
            content_language='spanish',
            content_disposition='inline')
        blob.set_http_headers(content_settings, etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        properties = blob.get_blob_properties()
        self.assertEqual(content_settings.content_language, properties.content_settings.content_language)
        self.assertEqual(content_settings.content_disposition, properties.content_settings.content_disposition)

    @BlobPreparer()
    def test_set_blob_properties_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            content_settings = ContentSettings(
                content_language='spanish',
                content_disposition='inline')
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_http_headers(content_settings, etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_properties_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        content_settings = ContentSettings(
            content_language='spanish',
            content_disposition='inline')
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_http_headers(content_settings, etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        properties = blob.get_blob_properties()
        self.assertEqual(content_settings.content_language, properties.content_settings.content_language)
        self.assertEqual(content_settings.content_disposition, properties.content_settings.content_disposition)

    @BlobPreparer()
    def test_set_blob_properties_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            content_settings = ContentSettings(
                content_language='spanish',
                content_disposition='inline')
            blob.set_http_headers(content_settings, etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_properties_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        properties = blob.get_blob_properties(if_modified_since=test_datetime)

        # Assert
        self.assertIsInstance(properties, BlobProperties)
        self.assertEqual(properties.blob_type.value, 'BlockBlob')
        self.assertEqual(properties.size, 11)
        self.assertEqual(properties.lease.status, 'unlocked')

    @pytest.mark.playback_test_only
    @BlobPreparer()
    def test_if_blob_exists(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        old_blob_version_id = blob.get_blob_properties().get("version_id")
        self.assertIsNotNone(old_blob_version_id)
        blob.stage_block(block_id='1', data="this is test content")
        blob.commit_block_list(['1'])
        new_blob_version_id = blob.get_blob_properties().get("version_id")

        # Assert
        self.assertEqual(blob.exists(version_id=old_blob_version_id), True)
        self.assertEqual(blob.exists(version_id=new_blob_version_id), True)
        self.assertEqual(blob.exists(version_id="2020-08-21T21:24:15.3585832Z"), False)

        # Act
        test_snapshot = blob.create_snapshot()
        blob_snapshot = bsc.get_blob_client(self.container_name, 'blob1', snapshot=test_snapshot)
        self.assertEqual(blob_snapshot.exists(), True)
        blob.stage_block(block_id='1', data="this is additional test content")
        blob.commit_block_list(['1'])

        # Assert
        self.assertEqual(blob_snapshot.exists(), True)
        self.assertEqual(blob.exists(), True)

    @BlobPreparer()
    def test_if_blob_with_cpk_exists(self, storage_account_name, storage_account_key):
        container_name = self.get_resource_name("testcontainer1")
        cc = ContainerClient(
            self.account_url(storage_account_name, "blob"), credential=storage_account_key, container_name=container_name,
            connection_data_block_size=4 * 1024)
        cc.create_container()
        self._setup()
        test_cpk = CustomerProvidedEncryptionKey(key_value=CPK_KEY_VALUE, key_hash=CPK_KEY_HASH)
        blob_client = cc.get_blob_client("test_blob")
        blob_client.upload_blob(b"hello world", cpk=test_cpk)

        # Act
        self.assertTrue(blob_client.exists())

    @BlobPreparer()
    def test_get_blob_properties_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_properties_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() + timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        properties = blob.get_blob_properties(if_unmodified_since=test_datetime)

        # Assert
        self.assertIsNotNone(properties)
        self.assertEqual(properties.blob_type.value, 'BlockBlob')
        self.assertEqual(properties.size, 11)
        self.assertEqual(properties.lease.status, 'unlocked')

    @BlobPreparer()
    def test_get_blob_properties_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() - timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_properties_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        properties = blob.get_blob_properties(etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertIsNotNone(properties)
        self.assertEqual(properties.blob_type.value, 'BlockBlob')
        self.assertEqual(properties.size, 11)
        self.assertEqual(properties.lease.status, 'unlocked')

    @BlobPreparer()
    def test_get_blob_properties_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_properties_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        properties = blob.get_blob_properties(etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        self.assertIsNotNone(properties)
self.assertEqual(properties.blob_type.value, 'BlockBlob')
self.assertEqual(properties.size, 11)
self.assertEqual(properties.lease.status, 'unlocked')
@BlobPreparer()
def test_get_blob_properties_with_if_none_match_fail(self, storage_account_name, storage_account_key):
bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
self._create_container_and_block_blob(
self.container_name, 'blob1', b'hello world', bsc)
blob = bsc.get_blob_client(self.container_name, 'blob1')
etag = blob.get_blob_properties().etag
# Act
with self.assertRaises(ResourceModifiedError) as e:
blob.get_blob_properties(etag=etag, match_condition=MatchConditions.IfModified)
# Assert
self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_get_blob_metadata_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        md = blob.get_blob_properties(if_modified_since=test_datetime).metadata

        # Assert
        self.assertIsNotNone(md)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(if_modified_since=test_datetime).metadata

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        md = blob.get_blob_properties(if_unmodified_since=test_datetime).metadata

        # Assert
        self.assertIsNotNone(md)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(if_unmodified_since=test_datetime).metadata

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        md = blob.get_blob_properties(etag=etag, match_condition=MatchConditions.IfNotModified).metadata

        # Assert
        self.assertIsNotNone(md)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.get_blob_properties(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified).metadata

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        md = blob.get_blob_properties(etag='0x111111111111111', match_condition=MatchConditions.IfModified).metadata

        # Assert
        self.assertIsNotNone(md)

    @BlobPreparer()
    def test_get_blob_metadata_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.get_blob_properties(etag=etag, match_condition=MatchConditions.IfModified).metadata

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_set_blob_metadata_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        metadata = {'hello': 'world', 'number': '42'}
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_blob_metadata(metadata, if_modified_since=test_datetime)

        # Assert
        md = blob.get_blob_properties().metadata
        self.assertDictEqual(metadata, md)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            metadata = {'hello': 'world', 'number': '42'}
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_blob_metadata(metadata, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        metadata = {'hello': 'world', 'number': '42'}
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_blob_metadata(metadata, if_unmodified_since=test_datetime)

        # Assert
        md = blob.get_blob_properties().metadata
        self.assertDictEqual(metadata, md)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            metadata = {'hello': 'world', 'number': '42'}
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_blob_metadata(metadata, if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        metadata = {'hello': 'world', 'number': '42'}
        blob.set_blob_metadata(metadata, etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        md = blob.get_blob_properties().metadata
        self.assertDictEqual(metadata, md)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            metadata = {'hello': 'world', 'number': '42'}
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.set_blob_metadata(metadata, etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        metadata = {'hello': 'world', 'number': '42'}
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.set_blob_metadata(metadata, etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        md = blob.get_blob_properties().metadata
        self.assertDictEqual(metadata, md)

    @BlobPreparer()
    def test_set_blob_metadata_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            metadata = {'hello': 'world', 'number': '42'}
            blob.set_blob_metadata(metadata, etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_delete_blob_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.delete_blob(if_modified_since=test_datetime)

        # Assert
        self.assertIsNone(resp)

    @BlobPreparer()
    def test_delete_blob_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.delete_blob(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_delete_blob_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.delete_blob(if_unmodified_since=test_datetime)

        # Assert
        self.assertIsNone(resp)

    @BlobPreparer()
    def test_delete_blob_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.delete_blob(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_delete_blob_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        resp = blob.delete_blob(etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertIsNone(resp)

    @BlobPreparer()
    def test_delete_blob_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.delete_blob(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_delete_blob_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.delete_blob(etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        self.assertIsNone(resp)

    @BlobPreparer()
    def test_delete_blob_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.delete_blob(etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_snapshot_blob_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.create_snapshot(if_modified_since=test_datetime)

        # Assert
        self.assertIsNotNone(resp)
        self.assertIsNotNone(resp['snapshot'])

    @BlobPreparer()
    def test_snapshot_blob_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.create_snapshot(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_snapshot_blob_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.create_snapshot(if_unmodified_since=test_datetime)

        # Assert
        self.assertIsNotNone(resp)
        self.assertIsNotNone(resp['snapshot'])

    @BlobPreparer()
    def test_snapshot_blob_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.create_snapshot(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_snapshot_blob_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        resp = blob.create_snapshot(etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertIsNotNone(resp)
        self.assertIsNotNone(resp['snapshot'])

    @BlobPreparer()
    def test_snapshot_blob_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.create_snapshot(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_snapshot_blob_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        resp = blob.create_snapshot(etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        self.assertIsNotNone(resp)
        self.assertIsNotNone(resp['snapshot'])

    @BlobPreparer()
    def test_snapshot_blob_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.create_snapshot(etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_lease_blob_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_lease_id = '00000000-1111-2222-3333-444444444444'
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        lease = blob.acquire_lease(
            if_modified_since=test_datetime,
            lease_id=test_lease_id)
        lease.break_lease()

        # Assert
        self.assertIsInstance(lease, BlobLeaseClient)
        self.assertIsNotNone(lease.id)

    @BlobPreparer()
    def test_lease_blob_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob = bsc.get_blob_client(self.container_name, 'blob1')
            blob.acquire_lease(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_lease_blob_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_lease_id = '00000000-1111-2222-3333-444444444444'
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        lease = blob.acquire_lease(
            if_unmodified_since=test_datetime,
            lease_id=test_lease_id)
        lease.break_lease()

        # Assert
        self.assertIsInstance(lease, BlobLeaseClient)
        self.assertIsNotNone(lease.id)

    @BlobPreparer()
    def test_lease_blob_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.acquire_lease(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_lease_blob_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag
        test_lease_id = '00000000-1111-2222-3333-444444444444'

        # Act
        lease = blob.acquire_lease(
            lease_id=test_lease_id,
            etag=etag, match_condition=MatchConditions.IfNotModified)
        lease.break_lease()

        # Assert
        self.assertIsInstance(lease, BlobLeaseClient)
        self.assertIsNotNone(lease.id)
        self.assertIsNotNone(lease.etag)
        self.assertEqual(lease.etag, etag)

    @BlobPreparer()
    def test_lease_blob_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.acquire_lease(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_lease_blob_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        test_lease_id = '00000000-1111-2222-3333-444444444444'

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        lease = blob.acquire_lease(
            lease_id=test_lease_id,
            etag='0x111111111111111',
            match_condition=MatchConditions.IfModified)
        lease.break_lease()

        # Assert
        self.assertIsInstance(lease, BlobLeaseClient)
        self.assertIsNotNone(lease.id)

    @BlobPreparer()
    def test_lease_blob_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_block_blob(
            self.container_name, 'blob1', b'hello world', bsc)
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.acquire_lease(etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
    @BlobPreparer()
    def test_put_block_list_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
        blob.commit_block_list(block_list, if_modified_since=test_datetime)

        # Assert
        content = blob.download_blob()
        self.assertEqual(content.readall(), b'AAABBBCCC')

    @pytest.mark.playback_test_only
    @BlobPreparer()
    def test_put_block_list_returns_vid(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
        resp = blob.commit_block_list(block_list, if_modified_since=test_datetime)

        # Assert
        self.assertIsNotNone(resp['version_id'])
        content = blob.download_blob()
        self.assertEqual(content.readall(), b'AAABBBCCC')

    @BlobPreparer()
    def test_put_block_list_with_metadata(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        metadata = {'hello': 'world', 'number': '43'}
        block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
        blob.commit_block_list(block_list, metadata=metadata, if_modified_since=test_datetime)

        # Assert
        content = blob.download_blob()
        properties = blob.get_blob_properties()
        self.assertEqual(content.readall(), b'AAABBBCCC')
        self.assertEqual(properties.metadata, metadata)

    @BlobPreparer()
    def test_put_block_list_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.commit_block_list(
                [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')],
                if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_put_block_list_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() +
                         timedelta(minutes=15))

        # Act
        block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
        blob.commit_block_list(block_list, if_unmodified_since=test_datetime)

        # Assert
        content = blob.download_blob()
        self.assertEqual(content.readall(), b'AAABBBCCC')

    @BlobPreparer()
    def test_put_block_list_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        test_datetime = (datetime.utcnow() -
                         timedelta(minutes=15))

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.commit_block_list(
                [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')],
                if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_put_block_list_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_block_blob(
            self.container_name, 'blob1', b'', bsc)
        blob.stage_block('1', b'AAA')
        blob.stage_block('2', b'BBB')
        blob.stage_block('3', b'CCC')
        etag = blob.get_blob_properties().etag

        # Act
        block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
        blob.commit_block_list(block_list, etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        content = blob.download_blob()
self.assertEqual(content.readall(), b'AAABBBCCC')
@BlobPreparer()
def test_put_block_list_with_if_match_fail(self, storage_account_name, storage_account_key):
bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
container, blob = self._create_container_and_block_blob(
self.container_name, 'blob1', b'', bsc)
blob.stage_block('1', b'AAA')
blob.stage_block('2', b'BBB')
blob.stage_block('3', b'CCC')
# Act
with self.assertRaises(ResourceModifiedError) as e:
blob.commit_block_list(
[BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')],
etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)
# Assert
self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
@BlobPreparer()
def test_put_block_list_with_if_none_match(self, storage_account_name, storage_account_key):
bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
container, blob = self._create_container_and_block_blob(
self.container_name, 'blob1', b'', bsc)
blob.stage_block('1', b'AAA')
blob.stage_block('2', b'BBB')
blob.stage_block('3', b'CCC')
# Act
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
blob.commit_block_list(block_list, etag='0x111111111111111', match_condition=MatchConditions.IfModified)
# Assert
content = blob.download_blob()
self.assertEqual(content.readall(), b'AAABBBCCC')
@BlobPreparer()
def test_put_block_list_with_if_none_match_fail(self, storage_account_name, storage_account_key):
bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
self._setup()
container, blob = self._create_container_and_block_blob(
self.container_name, 'blob1', b'', bsc)
blob.stage_block('1', b'AAA')
blob.stage_block('2', b'BBB')
blob.stage_block('3', b'CCC')
etag = blob.get_blob_properties().etag
# Act
with self.assertRaises(ResourceModifiedError) as e:
block_list = [BlobBlock(block_id='1'), BlobBlock(block_id='2'), BlobBlock(block_id='3')]
blob.commit_block_list(block_list, etag=etag, match_condition=MatchConditions.IfModified)
# Assert
self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
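The datetime preconditions exercised above all rely on one service-side rule: a timestamp 15 minutes in the past satisfies `if_modified_since` for a freshly written blob, while one 15 minutes in the future trips it (and symmetrically for `if_unmodified_since`). A minimal standalone sketch of that evaluation, with a hypothetical function name and no Azure SDK required:

```python
from datetime import datetime, timedelta

def datetime_precondition_met(last_modified, if_modified_since=None, if_unmodified_since=None):
    """Sketch of the service-side check behind if_modified_since /
    if_unmodified_since: the request succeeds only when the blob's
    Last-Modified time satisfies the supplied bound."""
    if if_modified_since is not None and last_modified <= if_modified_since:
        return False  # blob has NOT changed since the given time -> 412 condition_not_met
    if if_unmodified_since is not None and last_modified > if_unmodified_since:
        return False  # blob HAS changed since the given time -> 412 condition_not_met
    return True

now = datetime.utcnow()
# A freshly modified blob passes an if_modified_since bound 15 minutes in the past...
assert datetime_precondition_met(now, if_modified_since=now - timedelta(minutes=15))
# ...and fails one 15 minutes in the future, which is what the *_fail tests rely on.
assert not datetime_precondition_met(now, if_modified_since=now + timedelta(minutes=15))
```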

    @BlobPreparer()
    def test_update_page_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.upload_page(data, offset=0, length=512, if_modified_since=test_datetime)

        # Assert

    @BlobPreparer()
    def test_update_page_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_page(data, offset=0, length=512, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_update_page_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.upload_page(data, offset=0, length=512, if_unmodified_since=test_datetime)

        # Assert

    @BlobPreparer()
    def test_update_page_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_page(data, offset=0, length=512, if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_update_page_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        data = b'abcdefghijklmnop' * 32
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        blob.upload_page(data, offset=0, length=512, etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert

    @BlobPreparer()
    def test_update_page_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_page(data, offset=0, length=512, etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_update_page_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        data = b'abcdefghijklmnop' * 32

        # Act
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        blob.upload_page(data, offset=0, length=512, etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert

    @BlobPreparer()
    def test_update_page_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        self._create_container_and_page_blob(
            self.container_name, 'blob1', 1024, bsc)
        data = b'abcdefghijklmnop' * 32
        blob = bsc.get_blob_client(self.container_name, 'blob1')
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.upload_page(data, offset=0, length=512, etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)
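The etag-based tests follow the same template: the happy-path variants pass the blob's real etag with `MatchConditions.IfNotModified` (or a bogus etag with `IfModified`), while the `*_fail` variants invert the pairing so the precondition trips. A minimal sketch of that evaluation (a stand-in enum and illustrative function name, not the SDK's implementation):

```python
from enum import Enum

class Match(Enum):
    # Stand-ins for azure.core.MatchConditions as used in these tests.
    IfNotModified = 1  # sent as If-Match: proceed only when the etag still matches
    IfModified = 2     # sent as If-None-Match: proceed only when the etag differs

def etag_precondition_met(current_etag, supplied_etag, condition):
    """Sketch of how the service evaluates etag preconditions."""
    if condition is Match.IfNotModified:
        return current_etag == supplied_etag
    if condition is Match.IfModified:
        return current_etag != supplied_etag
    return True  # no condition supplied

# Real etag + IfNotModified succeeds, as in the happy-path tests...
assert etag_precondition_met('"0xREAL"', '"0xREAL"', Match.IfNotModified)
# ...while a bogus etag such as '0x111111111111111' trips it, as in the *_fail tests.
assert not etag_precondition_met('"0xREAL"', '0x111111111111111', Match.IfNotModified)
```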

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        test_datetime = datetime.utcnow() - timedelta(minutes=15)
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        ranges = blob.get_page_ranges(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(len(ranges[0]), 2)
        self.assertEqual(ranges[0][0], {'start': 0, 'end': 511})
        self.assertEqual(ranges[0][1], {'start': 1024, 'end': 1535})

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        test_datetime = datetime.utcnow() + timedelta(minutes=15)
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.get_page_ranges(if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        test_datetime = datetime.utcnow() + timedelta(minutes=15)
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        ranges = blob.get_page_ranges(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(len(ranges[0]), 2)
        self.assertEqual(ranges[0][0], {'start': 0, 'end': 511})
        self.assertEqual(ranges[0][1], {'start': 1024, 'end': 1535})

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        test_datetime = datetime.utcnow() - timedelta(minutes=15)
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.get_page_ranges(if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)
        etag = blob.get_blob_properties().etag

        # Act
        ranges = blob.get_page_ranges(etag=etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(len(ranges[0]), 2)
        self.assertEqual(ranges[0][0], {'start': 0, 'end': 511})
        self.assertEqual(ranges[0][1], {'start': 1024, 'end': 1535})

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.get_page_ranges(etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)

        # Act
        ranges = blob.get_page_ranges(etag='0x111111111111111', match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(len(ranges[0]), 2)
        self.assertEqual(ranges[0][0], {'start': 0, 'end': 511})
        self.assertEqual(ranges[0][1], {'start': 1024, 'end': 1535})

    @BlobPreparer()
    def test_get_page_ranges_iter_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_page_blob(
            self.container_name, 'blob1', 2048, bsc)
        data = b'abcdefghijklmnop' * 32
        blob.upload_page(data, offset=0, length=512)
        blob.upload_page(data, offset=1024, length=512)
        etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            blob.get_page_ranges(etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_block_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)

        # Act
        for i in range(5):
            resp = blob.append_block(u'block {0}'.format(i), if_modified_since=test_datetime)
            self.assertIsNotNone(resp)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(b'block 0block 1block 2block 3block 4', content)

    @BlobPreparer()
    def test_append_block_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            for i in range(5):
                resp = blob.append_block(u'block {0}'.format(i), if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_block_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)

        # Act
        for i in range(5):
            resp = blob.append_block(u'block {0}'.format(i), if_unmodified_since=test_datetime)
            self.assertIsNotNone(resp)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(b'block 0block 1block 2block 3block 4', content)

    @BlobPreparer()
    def test_append_block_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            for i in range(5):
                resp = blob.append_block(u'block {0}'.format(i), if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_block_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)

        # Act
        for i in range(5):
            etag = blob.get_blob_properties().etag
            resp = blob.append_block(u'block {0}'.format(i), etag=etag, match_condition=MatchConditions.IfNotModified)
            self.assertIsNotNone(resp)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(b'block 0block 1block 2block 3block 4', content)

    @BlobPreparer()
    def test_append_block_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)

        # Act
        with self.assertRaises(HttpResponseError) as e:
            for i in range(5):
                resp = blob.append_block(u'block {0}'.format(i), etag='0x111111111111111', match_condition=MatchConditions.IfNotModified)

        # Assert
        # self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_block_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)

        # Act
        for i in range(5):
            resp = blob.append_block(u'block {0}'.format(i), etag='0x8D2C9167D53FC2C', match_condition=MatchConditions.IfModified)
            self.assertIsNotNone(resp)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(b'block 0block 1block 2block 3block 4', content)

    @BlobPreparer()
    def test_append_block_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        container, blob = self._create_container_and_append_blob(self.container_name, 'blob1', bsc)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            for i in range(5):
                etag = blob.get_blob_properties().etag
                resp = blob.append_block(u'block {0}'.format(i), etag=etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_modified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)

        # Act
        data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
        blob.upload_blob(data, blob_type=BlobType.AppendBlob, if_modified_since=test_datetime)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(data, content)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_modified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
            blob.upload_blob(data, blob_type=BlobType.AppendBlob, if_modified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_unmodified(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_datetime = datetime.utcnow() + timedelta(minutes=15)

        # Act
        data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
        blob.upload_blob(data, blob_type=BlobType.AppendBlob, if_unmodified_since=test_datetime)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(data, content)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_unmodified_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_datetime = datetime.utcnow() - timedelta(minutes=15)

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
            blob.upload_blob(data, blob_type=BlobType.AppendBlob, if_unmodified_since=test_datetime)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_etag = blob.get_blob_properties().etag

        # Act
        data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
        blob.upload_blob(data, blob_type=BlobType.AppendBlob, etag=test_etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(data, content)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_etag = '0x8D2C9167D53FC2C'

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
            blob.upload_blob(data, blob_type=BlobType.AppendBlob, etag=test_etag, match_condition=MatchConditions.IfNotModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_none_match(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_etag = '0x8D2C9167D53FC2C'

        # Act
        data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
        blob.upload_blob(data, blob_type=BlobType.AppendBlob, etag=test_etag, match_condition=MatchConditions.IfModified)

        # Assert
        content = blob.download_blob().readall()
        self.assertEqual(data, content)

    @BlobPreparer()
    def test_append_blob_from_bytes_with_if_none_match_fail(self, storage_account_name, storage_account_key):
        bsc = BlobServiceClient(self.account_url(storage_account_name, "blob"), storage_account_key, connection_data_block_size=4 * 1024)
        self._setup()
        blob_name = self.get_resource_name("blob")
        container, blob = self._create_container_and_append_blob(self.container_name, blob_name, bsc)
        test_etag = blob.get_blob_properties().etag

        # Act
        with self.assertRaises(ResourceModifiedError) as e:
            data = self.get_random_bytes(LARGE_APPEND_BLOB_SIZE)
            blob.upload_blob(data, blob_type=BlobType.AppendBlob, etag=test_etag, match_condition=MatchConditions.IfModified)

        # Assert
        self.assertEqual(StorageErrorCode.condition_not_met, e.exception.error_code)

# ------------------------------------------------------------------------------
# src/abaqus/Property/MaterialOrientation.py (repo: Haiiliin/PyAbaqus, license: MIT)
] | null | null | null | from abaqusConstants import *
from ..Datum.DatumAxis import DatumAxis
from ..Datum.DatumCsys import DatumCsys
from ..Region.Set import Set
from ..Region.Surface import Surface
class MaterialOrientation:
"""The MaterialOrientation object represents the orientation of the material properties and
composite layups.
Attributes
----------
additionalRotationType: SymbolicConstant
A SymbolicConstant specifying the method used to describe the additional rotation when a
valid orientation is specified. Possible values are ROTATION_NONE, ROTATION_ANGLE, and
ROTATION_FIELD. The default value is ROTATION_NONE.
additionalRotationField: str
A String specifying the name of the :py:class:`~abaqus.Field.DiscreteField.DiscreteField` object specifying the additional
rotation. The default value is an empty string.
Notes
-----
This object can be accessed by:
.. code-block:: python
import section
mdb.models[name].parts[name].compositeLayups[i].orientation
mdb.models[name].parts[name].materialOrientations[i]
import odbAccess
session.odbs[name].parts[name].materialOrientations[i]
session.odbs[name].rootAssembly.instances[name].materialOrientations[i]
session.odbs[name].steps[name].frames[i].fieldOutputs[name].values[i].instance.materialOrientations[i]
"""
# A SymbolicConstant specifying the method used to describe the additional rotation when a
# valid orientation is specified. Possible values are ROTATION_NONE, ROTATION_ANGLE, and
# ROTATION_FIELD. The default value is ROTATION_NONE.
additionalRotationType: SymbolicConstant = ROTATION_NONE
# A String specifying the name of the DiscreteField object specifying the additional
# rotation. The default value is an empty string.
additionalRotationField: str = ''
def __init__(self, region: Set = None, localCsys: DatumCsys = DatumCsys(),
axis: SymbolicConstant = AXIS_1, angle: float = 0,
stackDirection: SymbolicConstant = STACK_3, fieldName: str = '',
orientationType: SymbolicConstant = GLOBAL,
normalAxisDirection: SymbolicConstant = AXIS_3,
normalAxisDefinition: SymbolicConstant = NORMAL_VECTOR,
normalAxisRegion: Surface = None, normalAxisDatum: DatumAxis = DatumAxis(),
flipNormalDirection: Boolean = OFF, normalAxisVector: tuple = (),
primaryAxisDirection: SymbolicConstant = AXIS_1,
primaryAxisDefinition: SymbolicConstant = PRIMARY_VECTOR,
primaryAxisRegion: Set = None, primaryAxisDatum: DatumAxis = DatumAxis(),
flipPrimaryDirection: Boolean = OFF, primaryAxisVector: tuple = ()):
"""This method creates a MaterialOrientation object.
Notes
-----
This function can be accessed by:
.. code-block:: python
mdb.models[name].parts[*name*].MaterialOrientation
Parameters
----------
region
A Set object specifying a region for which the material orientation is defined.
localCsys
A DatumCsys object specifying the local coordinate system or None, describing the
material orientation for the given region. In the ODB, this member was previously
accessible using "csys," but support has now been added for localCsys and the csys
member will be deprecated.
axis
A SymbolicConstant specifying the axis of a datum coordinate system about which an
additional rotation is applied. For shells this axis is also the shell normal. Possible
values are AXIS_1, AXIS_2, and AXIS_3. The default value is AXIS_1.
angle
A Float specifying the angle of the additional rotation (if accessed from the ODB
instead of the MDB, it will be a string instead of a float). The default value is 0.0.
stackDirection
A SymbolicConstant specifying the stack or thickness direction. Possible values are
STACK_1, STACK_2, STACK_3, and STACK_ORIENTATION. The default value is STACK_3.
fieldName
A String specifying the name of the DiscreteField object specifying the orientation. The
default value is an empty string.
orientationType
A SymbolicConstant specifying the method used to define the material orientation. If
*orientationType*=SYSTEM, the *region* and *localCsys* arguments are required. If
*orientationType*=FIELD, the *fieldName* argument is required. Possible values are
GLOBAL, SYSTEM, FIELD, DISCRETE, and USER. The default value is GLOBAL.
normalAxisDirection
A SymbolicConstant specifying the axis that is defined by the normal axis direction for
a discrete orientation. Possible values are AXIS_1, AXIS_2, and AXIS_3. The default
value is AXIS_3.
normalAxisDefinition
A SymbolicConstant specifying the method used to define the normal axis direction for a
discrete orientation. Possible values are SURFACE, NORMAL_DATUM, and NORMAL_VECTOR. The
default value is NORMAL_VECTOR.
normalAxisRegion
A Surface object specifying a region whose geometric normals define the normal axis for
the discrete orientation.
normalAxisDatum
A DatumAxis object specifying the Datum Axis or None, describing the normal axis
direction for the discrete orientation.
flipNormalDirection
A Boolean specifying the flag to reverse the direction of the defined normal axis
direction. The default value is OFF.
normalAxisVector
A sequence of Floats specifying the vector that defines the direction of the normal axis
of the discrete orientation.
primaryAxisDirection
A SymbolicConstant specifying the axis that is defined by the primary axis direction for
a discrete orientation. Possible values are AXIS_1, AXIS_2, and AXIS_3. The default
value is AXIS_1.
primaryAxisDefinition
A SymbolicConstant specifying the method used to define the primary axis direction for a
discrete orientation. Possible values are SURFACE, PRIMARY_DATUM, and PRIMARY_VECTOR.
The default value is PRIMARY_VECTOR.
primaryAxisRegion
A Set object specifying a region whose geometric tangents define the primary axis for
the discrete orientation.
primaryAxisDatum
A DatumAxis object specifying the Datum Axis or None, describing the primary axis
direction for the discrete orientation.
flipPrimaryDirection
A Boolean specifying the flag to reverse the direction of the defined primary axis
direction. The default value is OFF.
primaryAxisVector
A sequence of Floats specifying the vector that defines the direction of the primary
axis of the discrete orientation.
Returns
-------
A MaterialOrientation object.
"""
pass
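The *orientationType* rules documented above (SYSTEM requires *region* and *localCsys*, FIELD requires *fieldName*) can be sketched as a standalone validation helper. This is a hypothetical illustration, not part of the Abaqus scripting API; the function name and string-based constants are assumptions for the sketch.

```python
# Hypothetical sketch of the documented orientationType argument rules;
# not part of the Abaqus API. Constants are modeled as plain strings.
VALID_TYPES = {"GLOBAL", "SYSTEM", "FIELD", "DISCRETE", "USER"}

def check_orientation_args(orientationType="GLOBAL", region=None,
                           localCsys=None, fieldName=""):
    """Raise ValueError if the documented argument requirements are violated."""
    if orientationType not in VALID_TYPES:
        raise ValueError(f"unknown orientationType: {orientationType!r}")
    if orientationType == "SYSTEM" and (region is None or localCsys is None):
        raise ValueError("SYSTEM requires both 'region' and 'localCsys'")
    if orientationType == "FIELD" and not fieldName:
        raise ValueError("FIELD requires a non-empty 'fieldName'")
    return True
```

With the default GLOBAL type no extra arguments are needed, which matches the documented defaults.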
def ReferenceOrientation(self, localCsys: DatumCsys = DatumCsys(), axis: SymbolicConstant = AXIS_1,
angle: float = 0.0,
stackDirection: SymbolicConstant = STACK_3, fieldName: str = '',
orientationType: SymbolicConstant = GLOBAL, additionalRotationField: str = '',
additionalRotationType: SymbolicConstant = ROTATION_NONE,
normalAxisDirection: SymbolicConstant = AXIS_3,
normalAxisDefinition: SymbolicConstant = VECTOR, normalAxisRegion: Surface = None,
normalAxisDatum: DatumAxis = DatumAxis(), flipNormalDirection: Boolean = OFF,
normalAxisVector: tuple = (), primaryAxisDirection: SymbolicConstant = AXIS_1,
primaryAxisDefinition: SymbolicConstant = VECTOR, primaryAxisRegion: Set = None,
primaryAxisDatum: DatumAxis = DatumAxis(), flipPrimaryDirection: Boolean = OFF,
primaryAxisVector: tuple = ()):
"""This method creates a MaterialOrientation object.
Notes
-----
This function can be accessed by:

.. code-block:: python

    mdb.models[name].parts[name].MaterialOrientation
Parameters
----------
localCsys
A DatumCsys object specifying the local coordinate system or None, describing the
material orientation for the given region. In the ODB, this member was previously
accessible using "csys," but support has now been added for localCsys and the csys
member will be deprecated.
axis
A SymbolicConstant specifying the axis of a datum coordinate system about which an
additional rotation is applied. For shells this axis is also the shell normal. Possible
values are AXIS_1, AXIS_2, and AXIS_3. The default value is AXIS_1.
angle
A Float specifying the angle of the additional rotation (if accessed from the ODB
instead of the MDB, it will be a string instead of a float). The default value is 0.0.
stackDirection
A SymbolicConstant specifying the stack or thickness direction. Possible values are
STACK_1, STACK_2, STACK_3, and STACK_ORIENTATION. The default value is STACK_3.
fieldName
A String specifying the name of the DiscreteField object specifying the orientation. The
default value is an empty string.
orientationType
A SymbolicConstant specifying the method used to define the material orientation. If
*orientationType*=SYSTEM, the *localCsys* argument is required. If
*orientationType*=FIELD, the *fieldName* argument is required. Possible values are
GLOBAL, SYSTEM, FIELD, DISCRETE, and USER. The default value is GLOBAL.
additionalRotationField
A String specifying the name of the DiscreteField object specifying the additional
rotation. The default value is an empty string.
additionalRotationType
A SymbolicConstant specifying the method used to describe the additional rotation when a
valid orientation is specified. Possible values are ROTATION_NONE, ROTATION_ANGLE, and
ROTATION_FIELD. The default value is ROTATION_NONE.
normalAxisDirection
A SymbolicConstant specifying the axis that is defined by the normal axis direction for
a discrete orientation. Possible values are AXIS_1, AXIS_2, and AXIS_3. The default
value is AXIS_3.
normalAxisDefinition
A SymbolicConstant specifying the method used to define the normal axis direction for a
discrete orientation. Possible values are SURFACE, DATUM, and VECTOR. The default value
is VECTOR.
normalAxisRegion
A Surface object specifying a region whose geometric normals define the normal axis for
the discrete orientation.
normalAxisDatum
A DatumAxis object specifying the Datum Axis or None, describing the normal axis
direction for the discrete orientation.
flipNormalDirection
A Boolean specifying the flag to reverse the direction of the defined normal axis
direction. The default value is OFF.
normalAxisVector
A sequence of Floats specifying the vector that defines the direction of the normal axis
of the discrete orientation.
primaryAxisDirection
A SymbolicConstant specifying the axis that is defined by the primary axis direction for
a discrete orientation. Possible values are AXIS_1, AXIS_2, and AXIS_3. The default
value is AXIS_1.
primaryAxisDefinition
A SymbolicConstant specifying the method used to define the primary axis direction for a
discrete orientation. Possible values are EDGE, DATUM, and VECTOR. The default value is
VECTOR.
primaryAxisRegion
A Set object specifying a region whose geometric tangents define the primary axis for
the discrete orientation.
primaryAxisDatum
A DatumAxis object specifying the Datum Axis or None, describing the primary axis
direction for the discrete orientation.
flipPrimaryDirection
A Boolean specifying the flag to reverse the direction of the defined primary axis
direction. The default value is OFF.
primaryAxisVector
A sequence of Floats specifying the vector that defines the direction of the primary
axis of the discrete orientation.
Returns
-------
A MaterialOrientation object.
"""
pass
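The docstring above describes an additional rotation by *angle* about one of the datum axes (AXIS_1, AXIS_2, AXIS_3). A minimal sketch of such a rotation, assuming a right-handed convention and an angle given in degrees, is shown below; this is an illustration only, not Abaqus code.

```python
import math

# Minimal sketch (not Abaqus code) of an "additional rotation" by `angle`
# degrees about one of the three datum axes, returned as a 3x3 matrix
# (list of rows). Right-handed convention is assumed.
def additional_rotation(axis="AXIS_1", angle=0.0):
    a = math.radians(angle)
    c, s = math.cos(a), math.sin(a)
    if axis == "AXIS_1":
        return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == "AXIS_2":
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    if axis == "AXIS_3":
        return [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    raise ValueError(f"unknown axis: {axis!r}")
```

The default `angle=0.0` yields the identity, matching the documented default of no additional rotation.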
def setValues(self):
"""This method modifies the MaterialOrientation object.
"""
pass
from pyjamas_core import Supermodel
from pyjamas_core.util import Input, Output, Property
import numpy as np
# define the model class and inherit from class "Supermodel"
class Model(Supermodel):
# model constructor
def __init__(self, id, name: str):
# instantiate supermodel
super(Model, self).__init__(id, name)
# define outputs
self.outputs['kw_data'] = Output('PowerPlantData')
# define persistent variables
self.kw_data = None
async def func_birth(self):
# Temporary dummy data for testing purposes; will be replaced by data from the database (Kraftwerkspark).
# KWDaten: dictionary holding the parameters of the individual power plants
# ------------------------------------------------------------------------------------
# id fk_kwt kw_bezeichnung power[W] spez_info Capex Opex
# ------------------------------------------------------------------------------------
# 1 2 WT 1000000 NH: 150, Z0: 0.03 1 0.01
# 2 1 PV 2000000 NH: 0, Z0: {} 2 0.02
# 3 2 WT 3000000 NH: 200, Z0: 0.2 3 0.03
# 4 1 PV 4000000 NH: 0, Z0: {} 4 0.04
# 5 2 WT 5000000 NH: 250, Z0: 0.03 5 0.05
# 6 1 PV 6000000 NH: 0, Z0: {} 6 0.06
# 8 3 OTHER 1000000 NH: 0, Z0: {} 7 0.07
# 10 3 OTHER 1000000 NH: 0, Z0: {} 8 0.08
# 11 4 OTHER 1000000 NH: 0, Z0: {} 9 0.09
# [KWID, FKKWT, KWBezeichnung, Power, Weitere spezifische parameter(Nabenhoehe, Z0, usw.), Capex, Opex, KEV, Brennstoffkosten, Entsorgungskostne, CO2-Kosten, usw.]
'''
kev1 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
kev2 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
kev4 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
kev5 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
kev11 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
kev13 = [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32
,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56,7,8,96,10,10,20,23,24,25,56]
KeinKEV = [0]*96 #Array[96] filled with 0s
'kev':[kev1, kev2, KeinKEV, kev4, kev5, KeinKEV, KeinKEV, KeinKEV, kev11, KeinKEV, kev13, KeinKEV],
'''
self.KWDaten = {'id': [1, 2, 3, 4, 5, 6, 8, 10, 11, 12, 13, 14], 'fk_kraftwerkstyp': [2, 1, 2, 1, 2, 1, 5, 3, 3, 6, 4, 4],
'bez_kraftwerkstyp': ['Windturbine','Photovoltaik','Windturbine','Photovoltaik','Windturbine','Photovoltaik', 'Others', 'Laufwasserkraftwerk','Laufwasserkraftwerk','Others','Speicherwasserkraftwerk','Speicherwasserkraftwerk'],
'p_inst': [1000000, 2000000, 3000000, 4000000, 5000000, 6000000, 8000000, 10000000, 11000000, 12000000, 13000000, 14000000],
'spez_info': [{'NH': 150, 'Z0': 0.03}, {}, {'NH': 100, 'Z0': 0.2}, {}, {'NH': 250, 'Z0': 0.03}, {}, {}, {}, {}, {}, {}, {},],
'capex': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
'spez_opex': [0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07, 0.08, 0.09, 0.10, 0.11, 0.12],
'brennstoffkosten': [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.10, 0.11, 0.12],
'co2_kosten': [0.001, 0.002, 0.003, 0.004, 0.005, 0.006, 0.007, 0.008, 0.009, 0.0010, 0.0011, 0.0012],
'entsorgungskosten': [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.10, 0.11, 0.12],
'lat':[51.165691, 46.227638, 47.516231, 41.871940, 51.165691, 46.227638, 47.516231, 41.871940, 51.165691, 46.227638, 47.516231, 41.871940],
'long':[10.451526, 2.213749, 14.550072, 12.567380, 10.451526, 2.213749, 14.550072, 12.567380, 10.451526, 2.213749, 14.550072, 12.567380]}
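The `KWDaten` dict above is column-oriented: each key maps to a list, and the i-th entry of every list belongs to the same power plant. A small hypothetical helper (not part of this model) shows how such a table can be filtered by one column; the function name and the reduced example table are assumptions for the sketch.

```python
# Hypothetical helper (not part of this model) for filtering a
# column-oriented table like self.KWDaten. Every column list is assumed
# to have the same length; row i across all columns is one power plant.
def filter_rows(table, column, value):
    keep = [i for i, v in enumerate(table[column]) if v == value]
    return {k: [col[i] for i in keep] for k, col in table.items()}

# Example with the same layout as self.KWDaten, reduced to a few columns:
kw = {'id': [1, 2, 3, 4],
      'fk_kraftwerkstyp': [2, 1, 2, 1],
      'p_inst': [1000000, 2000000, 3000000, 4000000]}
wind = filter_rows(kw, 'fk_kraftwerkstyp', 2)  # plants of type 2 only
```

The original table is left untouched; `filter_rows` builds a new dict with the selected rows.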
'''
self.KWDaten = {'id': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 
418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475],
'kw_bezeichnung': ['ALB_Biomassekraftwerk', 'ALB_GT absicherung', 'ALB_GT peak', 'ALB_Gaskombikraftwerk alt', 'ALB_Gaskombikraftwerk neu', 'ALB_Steinkohlekraftwerk alt', 'ALB_Steinkohlekraftwerk neu', 'ALB_Laufwasserkraftwerk', 'ALB_Speicherwasserkraftwerk', 'ALB_Braunkohlekraftwerk alt', 'ALB_Braunkohlekraftwerk neu', 'ALB_Kernkraftwerk', 'ALB_Oelkraftwerk', 'ALB_Weitere non-RES', 'ALB_Weitere RES', 'ALB_PV', 'ALB_Wind onshore stark', 'ALB_Wind onshore schwach', 'ALB_Wind offshore', 'AUT_Biomassekraftwerk', 'AUT_GT absicherung', 'AUT_GT peak', 'AUT_Gaskombikraftwerk alt', 'AUT_Gaskombikraftwerk neu', 'AUT_Steinkohlekraftwerk alt', 'AUT_Steinkohlekraftwerk neu', 'AUT_Laufwasserkraftwerk', 'AUT_Speicherwasserkraftwerk', 'AUT_Braunkohlekraftwerk alt', 'AUT_Braunkohlekraftwerk neu', 'AUT_Kernkraftwerk', 'AUT_Oelkraftwerk', 'AUT_Weitere non-RES', 'AUT_Weitere RES', 'AUT_PV', 'AUT_Wind onshore stark', 'AUT_Wind onshore schwach', 'AUT_Wind offshore', 'BIH_Biomassekraftwerk', 'BIH_GT absicherung', 'BIH_GT peak', 'BIH_Gaskombikraftwerk alt', 'BIH_Gaskombikraftwerk neu', 'BIH_Steinkohlekraftwerk alt', 'BIH_Steinkohlekraftwerk neu', 'BIH_Laufwasserkraftwerk', 'BIH_Speicherwasserkraftwerk', 'BIH_Braunkohlekraftwerk alt', 'BIH_Braunkohlekraftwerk neu', 'BIH_Kernkraftwerk', 'BIH_Oelkraftwerk', 'BIH_Weitere non-RES', 'BIH_Weitere RES', 'BIH_PV', 'BIH_Wind onshore stark', 'BIH_Wind onshore schwach', 'BIH_Wind offshore', 'BEL_Biomassekraftwerk', 'BEL_GT absicherung', 'BEL_GT peak', 'BEL_Gaskombikraftwerk alt', 'BEL_Gaskombikraftwerk neu', 'BEL_Steinkohlekraftwerk alt', 'BEL_Steinkohlekraftwerk neu', 'BEL_Laufwasserkraftwerk', 'BEL_Speicherwasserkraftwerk', 'BEL_Braunkohlekraftwerk alt', 'BEL_Braunkohlekraftwerk neu', 'BEL_Kernkraftwerk', 'BEL_Oelkraftwerk', 'BEL_Weitere non-RES', 'BEL_Weitere RES', 'BEL_PV', 'BEL_Wind onshore stark', 'BEL_Wind onshore schwach', 'BEL_Wind offshore', 'BGR_Biomassekraftwerk', 'BGR_GT absicherung', 'BGR_GT peak', 'BGR_Gaskombikraftwerk alt', 
'BGR_Gaskombikraftwerk neu', 'BGR_Steinkohlekraftwerk alt', 'BGR_Steinkohlekraftwerk neu', 'BGR_Laufwasserkraftwerk', 'BGR_Speicherwasserkraftwerk', 'BGR_Braunkohlekraftwerk alt', 'BGR_Braunkohlekraftwerk neu', 'BGR_Kernkraftwerk', 'BGR_Oelkraftwerk', 'BGR_Weitere non-RES', 'BGR_Weitere RES', 'BGR_PV', 'BGR_Wind onshore stark', 'BGR_Wind onshore schwach', 'BGR_Wind offshore', 'CHE_Biomassekraftwerk', 'CHE_GT absicherung', 'CHE_GT peak', 'CHE_Gaskombikraftwerk alt', 'CHE_Gaskombikraftwerk neu', 'CHE_Steinkohlekraftwerk alt', 'CHE_Steinkohlekraftwerk neu', 'CHE_Laufwasserkraftwerk', 'CHE_Speicherwasserkraftwerk', 'CHE_Braunkohlekraftwerk alt', 'CHE_Braunkohlekraftwerk neu', 'CHE_Kernkraftwerk', 'CHE_Oelkraftwerk', 'CHE_Weitere non-RES', 'CHE_Weitere RES', 'CHE_PV', 'CHE_Wind onshore stark', 'CHE_Wind onshore schwach', 'CHE_Wind offshore', 'CZE_Biomassekraftwerk', 'CZE_GT absicherung', 'CZE_GT peak', 'CZE_Gaskombikraftwerk alt', 'CZE_Gaskombikraftwerk neu', 'CZE_Steinkohlekraftwerk alt', 'CZE_Steinkohlekraftwerk neu', 'CZE_Laufwasserkraftwerk', 'CZE_Speicherwasserkraftwerk', 'CZE_Braunkohlekraftwerk alt', 'CZE_Braunkohlekraftwerk neu', 'CZE_Kernkraftwerk', 'CZE_Oelkraftwerk', 'CZE_Weitere non-RES', 'CZE_Weitere RES', 'CZE_PV', 'CZE_Wind onshore stark', 'CZE_Wind onshore schwach', 'CZE_Wind offshore', 'DEU_Biomassekraftwerk', 'DEU_GT absicherung', 'DEU_GT peak', 'DEU_Gaskombikraftwerk alt', 'DEU_Gaskombikraftwerk neu', 'DEU_Steinkohlekraftwerk alt', 'DEU_Steinkohlekraftwerk neu', 'DEU_Laufwasserkraftwerk', 'DEU_Speicherwasserkraftwerk', 'DEU_Braunkohlekraftwerk alt', 'DEU_Braunkohlekraftwerk neu', 'DEU_Kernkraftwerk', 'DEU_Oelkraftwerk', 'DEU_Weitere non-RES', 'DEU_Weitere RES', 'DEU_PV', 'DEU_Wind onshore stark', 'DEU_Wind onshore schwach', 'DEU_Wind offshore', 'DNK_Biomassekraftwerk', 'DNK_GT absicherung', 'DNK_GT peak', 'DNK_Gaskombikraftwerk alt', 'DNK_Gaskombikraftwerk neu', 'DNK_Steinkohlekraftwerk alt', 'DNK_Steinkohlekraftwerk neu', 'DNK_Laufwasserkraftwerk', 
'DNK_Speicherwasserkraftwerk', 'DNK_Braunkohlekraftwerk alt', 'DNK_Braunkohlekraftwerk neu', 'DNK_Kernkraftwerk', 'DNK_Oelkraftwerk', 'DNK_Weitere non-RES', 'DNK_Weitere RES', 'DNK_PV', 'DNK_Wind onshore stark', 'DNK_Wind onshore schwach', 'DNK_Wind offshore', 'ESP_Biomassekraftwerk', 'ESP_GT absicherung', 'ESP_GT peak', 'ESP_Gaskombikraftwerk alt', 'ESP_Gaskombikraftwerk neu', 'ESP_Steinkohlekraftwerk alt', 'ESP_Steinkohlekraftwerk neu', 'ESP_Laufwasserkraftwerk', 'ESP_Speicherwasserkraftwerk', 'ESP_Braunkohlekraftwerk alt', 'ESP_Braunkohlekraftwerk neu', 'ESP_Kernkraftwerk', 'ESP_Oelkraftwerk', 'ESP_Weitere non-RES', 'ESP_Weitere RES', 'ESP_PV', 'ESP_Wind onshore stark', 'ESP_Wind onshore schwach', 'ESP_Wind offshore', 'FRA_Biomassekraftwerk', 'FRA_GT absicherung', 'FRA_GT peak', 'FRA_Gaskombikraftwerk alt', 'FRA_Gaskombikraftwerk neu', 'FRA_Steinkohlekraftwerk alt', 'FRA_Steinkohlekraftwerk neu', 'FRA_Laufwasserkraftwerk', 'FRA_Speicherwasserkraftwerk', 'FRA_Braunkohlekraftwerk alt', 'FRA_Braunkohlekraftwerk neu', 'FRA_Kernkraftwerk', 'FRA_Oelkraftwerk', 'FRA_Weitere non-RES', 'FRA_Weitere RES', 'FRA_PV', 'FRA_Wind onshore stark', 'FRA_Wind onshore schwach', 'FRA_Wind offshore', 'GRC_Biomassekraftwerk', 'GRC_GT absicherung', 'GRC_GT peak', 'GRC_Gaskombikraftwerk alt', 'GRC_Gaskombikraftwerk neu', 'GRC_Steinkohlekraftwerk alt', 'GRC_Steinkohlekraftwerk neu', 'GRC_Laufwasserkraftwerk', 'GRC_Speicherwasserkraftwerk', 'GRC_Braunkohlekraftwerk alt', 'GRC_Braunkohlekraftwerk neu', 'GRC_Kernkraftwerk', 'GRC_Oelkraftwerk', 'GRC_Weitere non-RES', 'GRC_Weitere RES', 'GRC_PV', 'GRC_Wind onshore stark', 'GRC_Wind onshore schwach', 'GRC_Wind offshore', 'HRV_Biomassekraftwerk', 'HRV_GT absicherung', 'HRV_GT peak', 'HRV_Gaskombikraftwerk alt', 'HRV_Gaskombikraftwerk neu', 'HRV_Steinkohlekraftwerk alt', 'HRV_Steinkohlekraftwerk neu', 'HRV_Laufwasserkraftwerk', 'HRV_Speicherwasserkraftwerk', 'HRV_Braunkohlekraftwerk alt', 'HRV_Braunkohlekraftwerk neu', 'HRV_Kernkraftwerk', 
'HRV_Oelkraftwerk', 'HRV_Weitere non-RES', 'HRV_Weitere RES', 'HRV_PV', 'HRV_Wind onshore stark', 'HRV_Wind onshore schwach', 'HRV_Wind offshore', 'HUN_Biomassekraftwerk', 'HUN_GT absicherung', 'HUN_GT peak', 'HUN_Gaskombikraftwerk alt', 'HUN_Gaskombikraftwerk neu', 'HUN_Steinkohlekraftwerk alt', 'HUN_Steinkohlekraftwerk neu', 'HUN_Laufwasserkraftwerk', 'HUN_Speicherwasserkraftwerk', 'HUN_Braunkohlekraftwerk alt', 'HUN_Braunkohlekraftwerk neu', 'HUN_Kernkraftwerk', 'HUN_Oelkraftwerk', 'HUN_Weitere non-RES', 'HUN_Weitere RES', 'HUN_PV', 'HUN_Wind onshore stark', 'HUN_Wind onshore schwach', 'HUN_Wind offshore', 'ITA_Biomassekraftwerk', 'ITA_GT absicherung', 'ITA_GT peak', 'ITA_Gaskombikraftwerk alt', 'ITA_Gaskombikraftwerk neu', 'ITA_Steinkohlekraftwerk alt', 'ITA_Steinkohlekraftwerk neu', 'ITA_Laufwasserkraftwerk', 'ITA_Speicherwasserkraftwerk', 'ITA_Braunkohlekraftwerk alt', 'ITA_Braunkohlekraftwerk neu', 'ITA_Kernkraftwerk', 'ITA_Oelkraftwerk', 'ITA_Weitere non-RES', 'ITA_Weitere RES', 'ITA_PV', 'ITA_Wind onshore stark', 'ITA_Wind onshore schwach', 'ITA_Wind offshore', 'LUX_Biomassekraftwerk', 'LUX_GT absicherung', 'LUX_GT peak', 'LUX_Gaskombikraftwerk alt', 'LUX_Gaskombikraftwerk neu', 'LUX_Steinkohlekraftwerk alt', 'LUX_Steinkohlekraftwerk neu', 'LUX_Laufwasserkraftwerk', 'LUX_Speicherwasserkraftwerk', 'LUX_Braunkohlekraftwerk alt', 'LUX_Braunkohlekraftwerk neu', 'LUX_Kernkraftwerk', 'LUX_Oelkraftwerk', 'LUX_Weitere non-RES', 'LUX_Weitere RES', 'LUX_PV', 'LUX_Wind onshore stark', 'LUX_Wind onshore schwach', 'LUX_Wind offshore', 'MNE_Biomassekraftwerk', 'MNE_GT absicherung', 'MNE_GT peak', 'MNE_Gaskombikraftwerk alt', 'MNE_Gaskombikraftwerk neu', 'MNE_Steinkohlekraftwerk alt', 'MNE_Steinkohlekraftwerk neu', 'MNE_Laufwasserkraftwerk', 'MNE_Speicherwasserkraftwerk', 'MNE_Braunkohlekraftwerk alt', 'MNE_Braunkohlekraftwerk neu', 'MNE_Kernkraftwerk', 'MNE_Oelkraftwerk', 'MNE_Weitere non-RES', 'MNE_Weitere RES', 'MNE_PV', 'MNE_Wind onshore stark', 'MNE_Wind onshore 
schwach', 'MNE_Wind offshore', 'MKD_Biomassekraftwerk', 'MKD_GT absicherung', 'MKD_GT peak', 'MKD_Gaskombikraftwerk alt', 'MKD_Gaskombikraftwerk neu', 'MKD_Steinkohlekraftwerk alt', 'MKD_Steinkohlekraftwerk neu', 'MKD_Laufwasserkraftwerk', 'MKD_Speicherwasserkraftwerk', 'MKD_Braunkohlekraftwerk alt', 'MKD_Braunkohlekraftwerk neu', 'MKD_Kernkraftwerk', 'MKD_Oelkraftwerk', 'MKD_Weitere non-RES', 'MKD_Weitere RES', 'MKD_PV', 'MKD_Wind onshore stark', 'MKD_Wind onshore schwach', 'MKD_Wind offshore', 'NLD_Biomassekraftwerk', 'NLD_GT absicherung', 'NLD_GT peak', 'NLD_Gaskombikraftwerk alt', 'NLD_Gaskombikraftwerk neu', 'NLD_Steinkohlekraftwerk alt', 'NLD_Steinkohlekraftwerk neu', 'NLD_Laufwasserkraftwerk', 'NLD_Speicherwasserkraftwerk', 'NLD_Braunkohlekraftwerk alt', 'NLD_Braunkohlekraftwerk neu', 'NLD_Kernkraftwerk', 'NLD_Oelkraftwerk', 'NLD_Weitere non-RES', 'NLD_Weitere RES', 'NLD_PV', 'NLD_Wind onshore stark', 'NLD_Wind onshore schwach', 'NLD_Wind offshore', 'POL_Biomassekraftwerk', 'POL_GT absicherung', 'POL_GT peak', 'POL_Gaskombikraftwerk alt', 'POL_Gaskombikraftwerk neu', 'POL_Steinkohlekraftwerk alt', 'POL_Steinkohlekraftwerk neu', 'POL_Laufwasserkraftwerk', 'POL_Speicherwasserkraftwerk', 'POL_Braunkohlekraftwerk alt', 'POL_Braunkohlekraftwerk neu', 'POL_Kernkraftwerk', 'POL_Oelkraftwerk', 'POL_Weitere non-RES', 'POL_Weitere RES', 'POL_PV', 'POL_Wind onshore stark', 'POL_Wind onshore schwach', 'POL_Wind offshore', 'PRT_Biomassekraftwerk', 'PRT_GT absicherung', 'PRT_GT peak', 'PRT_Gaskombikraftwerk alt', 'PRT_Gaskombikraftwerk neu', 'PRT_Steinkohlekraftwerk alt', 'PRT_Steinkohlekraftwerk neu', 'PRT_Laufwasserkraftwerk', 'PRT_Speicherwasserkraftwerk', 'PRT_Braunkohlekraftwerk alt', 'PRT_Braunkohlekraftwerk neu', 'PRT_Kernkraftwerk', 'PRT_Oelkraftwerk', 'PRT_Weitere non-RES', 'PRT_Weitere RES', 'PRT_PV', 'PRT_Wind onshore stark', 'PRT_Wind onshore schwach', 'PRT_Wind offshore', 'ROU_Biomassekraftwerk', 'ROU_GT absicherung', 'ROU_GT peak', 'ROU_Gaskombikraftwerk 
alt', 'ROU_Gaskombikraftwerk neu', 'ROU_Steinkohlekraftwerk alt', 'ROU_Steinkohlekraftwerk neu', 'ROU_Laufwasserkraftwerk', 'ROU_Speicherwasserkraftwerk', 'ROU_Braunkohlekraftwerk alt', 'ROU_Braunkohlekraftwerk neu', 'ROU_Kernkraftwerk', 'ROU_Oelkraftwerk', 'ROU_Weitere non-RES', 'ROU_Weitere RES', 'ROU_PV', 'ROU_Wind onshore stark', 'ROU_Wind onshore schwach', 'ROU_Wind offshore', 'SRB_Biomassekraftwerk', 'SRB_GT absicherung', 'SRB_GT peak', 'SRB_Gaskombikraftwerk alt', 'SRB_Gaskombikraftwerk neu', 'SRB_Steinkohlekraftwerk alt', 'SRB_Steinkohlekraftwerk neu', 'SRB_Laufwasserkraftwerk', 'SRB_Speicherwasserkraftwerk', 'SRB_Braunkohlekraftwerk alt', 'SRB_Braunkohlekraftwerk neu', 'SRB_Kernkraftwerk', 'SRB_Oelkraftwerk', 'SRB_Weitere non-RES', 'SRB_Weitere RES', 'SRB_PV', 'SRB_Wind onshore stark', 'SRB_Wind onshore schwach', 'SRB_Wind offshore', 'SVN_Biomassekraftwerk', 'SVN_GT absicherung', 'SVN_GT peak', 'SVN_Gaskombikraftwerk alt', 'SVN_Gaskombikraftwerk neu', 'SVN_Steinkohlekraftwerk alt', 'SVN_Steinkohlekraftwerk neu', 'SVN_Laufwasserkraftwerk', 'SVN_Speicherwasserkraftwerk', 'SVN_Braunkohlekraftwerk alt', 'SVN_Braunkohlekraftwerk neu', 'SVN_Kernkraftwerk', 'SVN_Oelkraftwerk', 'SVN_Weitere non-RES', 'SVN_Weitere RES', 'SVN_PV', 'SVN_Wind onshore stark', 'SVN_Wind onshore schwach', 'SVN_Wind offshore', 'SVK_Biomassekraftwerk', 'SVK_GT absicherung', 'SVK_GT peak', 'SVK_Gaskombikraftwerk alt', 'SVK_Gaskombikraftwerk neu', 'SVK_Steinkohlekraftwerk alt', 'SVK_Steinkohlekraftwerk neu', 'SVK_Laufwasserkraftwerk', 'SVK_Speicherwasserkraftwerk', 'SVK_Braunkohlekraftwerk alt', 'SVK_Braunkohlekraftwerk neu', 'SVK_Kernkraftwerk', 'SVK_Oelkraftwerk', 'SVK_Weitere non-RES', 'SVK_Weitere RES', 'SVK_PV', 'SVK_Wind onshore stark', 'SVK_Wind onshore schwach', 'SVK_Wind offshore'],
'lat': [41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 41.14244989, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 47.58549439, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 44.17450125, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 50.63981576, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 42.76890318, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 46.79785878, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 49.73341233, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 51.10698181, 55.98125296, 
55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 55.98125296, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 40.24448698, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 42.17344011, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 39.07469623, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 45.08047631, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 47.16277506, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 42.79662641, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 49.76725361, 42.78890259, 42.78890259, 
42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 42.78890259, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 41.59530893, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.1007899, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 52.12759564, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 39.59550671, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 45.85243127, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 44.2215032, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 46.11554772, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 
48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528, 48.70547528],
'long': [20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 20.04983396, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 14.1264761, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 17.76876733, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 4.640651139, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 25.21552909, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 8.208674706, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 15.31240163, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.38578051, 10.02800992, 10.02800992, 
10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, 10.02800992, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -3.647550473, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, -2.761729445, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 22.95555794, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 16.40412899, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 19.39559116, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 12.07001339, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 6.071822011, 
19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 19.23883939, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 21.68211346, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 5.28144793, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, 19.39012835, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, -8.501043613, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 24.97293039, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 20.78958334, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 14.80444238, 19.47905218, 
19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218, 19.47905218], 'p_inst': [0.0, 100000000.0, 30000000.0, 35000000.0, 35000000.0, 0.0, 0.0, 393563000.0, 1818000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4820000000.0, 1446000000.0, 1687000000.0, 1687000000.0, 299000000.0, 299000000.0, 4509000000.0, 9989000000.0, 0.0, 0.0, 0.0, 174000000.0, 984000000.0, 620000000.0, 2000000000.0, 1940000000.0, 1940000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1079000000.0, 961000000.0, 1161500000.0, 1161500000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 50000000.0, 50000000.0, 0.0, 648600000.0, 4285000000.0, 1285500000.0, 1499750000.0, 1499750000.0, 0.0, 0.0, 117000000.0, 1308000000.0, 0.0, 0.0, 5919000000.0, 0.0, 1157000000.0, 658000000.0, 4044250000.0, 1206500000.0, 1206500000.0, 2310000000.0, 0.0, 12000000.0, 3600000.0, 4200000.0, 4200000.0, 84000000.0, 84000000.0, 540000000.0, 2650000000.0, 1947500000.0, 1947500000.0, 2200000000.0, 0.0, 1314000000.0, 90000000.0, 1300000000.0, 450000000.0, 450000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4139000000.0, 13580000000.0, 0.0, 0.0, 2905000000.0, 0.0, 520000000.0, 380000000.0, 2600000000.0, 60000000.0, 60000000.0, 0.0, 0.0, 1390000000.0, 417000000.0, 486500000.0, 486500000.0, 493500000.0, 493500000.0, 365000000.0, 1050000000.0, 3352000000.0, 3352000000.0, 4055000000.0, 0.0, 1907000000.0, 917000000.0, 2380000000.0, 300000000.0, 300000000.0, 0.0, 0.0, 23234752170.0, 6970425652.0, 8132163261.0, 8132163261.0, 11612775000.0, 11612775000.0, 4329000000.0, 6300000000.0, 8317250000.0, 8317250000.0, 8107000000.0, 1713490000.0, 7943520000.0, 7827000000.0, 49000000000.0, 26250000000.0, 26250000000.0, 7600000000.0, 1170000000.0, 430000000.0, 129000000.0, 150500000.0, 150500000.0, 595000000.0, 595000000.0, 6608200.0, 0.0, 0.0, 0.0, 0.0, 817000000.0, 125751000.0, 
790400000.0, 1083000000.0, 2173000000.0, 2173000000.0, 2272000000.0, 0.0, 24560000000.0, 7368000000.0, 8596000000.0, 8596000000.0, 4700000000.0, 4700000000.0, 3600000000.0, 16890000000.0, 0.0, 0.0, 7117000000.0, 0.0, 7570000000.0, 1250000000.0, 8100000000.0, 13050000000.0, 13050000000.0, 0.0, 0.0, 11035000000.0, 3310500000.0, 3862250000.0, 3862250000.0, 1465000000.0, 1465000000.0, 13600000000.0, 11500000000.0, 0.0, 0.0, 63020000000.0, 1358000000.0, 0.0, 2191000000.0, 11600000000.0, 8150000000.0, 8150000000.0, 1000000000.0, 0.0, 5214000000.0, 1564200000.0, 1824900000.0, 1824900000.0, 0.0, 0.0, 252000000.0, 3331000000.0, 1120000000.0, 1120000000.0, 0.0, 0.0, 0.0, 325000000.0, 2800000000.0, 1350000000.0, 1350000000.0, 0.0, 0.0, 1500000000.0, 450000000.0, 525000000.0, 525000000.0, 100000000.0, 100000000.0, 300000000.0, 1800000000.0, 0.0, 0.0, 0.0, 0.0, 200000000.0, 250000000.0, 45000000.0, 375000000.0, 375000000.0, 0.0, 0.0, 2260400000.0, 678120000.0, 791140000.0, 791140000.0, 82550000.0, 82550000.0, 60000000.0, 0.0, 426000000.0, 426000000.0, 1888000000.0, 410000000.0, 585000000.0, 330000000.0, 750000000.0, 164500000.0, 164500000.0, 0.0, 0.0, 36000364000.0, 10800109200.0, 12600127400.0, 12600127400.0, 3528000000.0, 3528000000.0, 5390151286.0, 16455869690.0, 0.0, 0.0, 0.0, 1145700000.0, 7564236000.0, 5212853900.0, 21077000000.0, 5587500000.0, 5587500000.0, 30000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 34000000.0, 1310000000.0, 0.0, 0.0, 0.0, 0.0, 90000000.0, 41300000.0, 140000000.0, 75000000.0, 75000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 91350000.0, 728500000.0, 100000000.0, 100000000.0, 0.0, 0.0, 0.0, 33000000.0, 10000000.0, 75600000.0, 75600000.0, 0.0, 0.0, 290000000.0, 87000000.0, 101500000.0, 101500000.0, 0.0, 0.0, 141000000.0, 579000000.0, 307500000.0, 307500000.0, 0.0, 198000000.0, 0.0, 26000000.0, 32000000.0, 25000000.0, 25000000.0, 0.0, 0.0, 9285800000.0, 2785740000.0, 3250030000.0, 3250030000.0, 2304000000.0, 2304000000.0, 38000000.0, 0.0, 0.0, 0.0, 
486000000.0, 0.0, 4289000000.0, 507000000.0, 4376000000.0, 2570035000.0, 2570035000.0, 2374120000.0, 0.0, 1506000000.0, 451800000.0, 527100000.0, 527100000.0, 7551000000.0, 7551000000.0, 983000000.0, 1413000000.0, 3568000000.0, 3568000000.0, 0.0, 0.0, 7500000000.0, 985000000.0, 350000000.0, 3525000000.0, 3525000000.0, 0.0, 0.0, 3829000000.0, 1148700000.0, 1340150000.0, 1340150000.0, 878000000.0, 878000000.0, 734700000.0, 6466300000.0, 0.0, 0.0, 0.0, 0.0, 1052000000.0, 843000000.0, 1816000000.0, 2728000000.0, 2728000000.0, 35000000.0, 0.0, 3427555000.0, 1028266500.0, 1199644250.0, 1199644250.0, 214000000.0, 214000000.0, 3214832867.0, 3289825700.0, 1608250000.0, 1608250000.0, 1300000000.0, 0.0, 0.0, 180000000.0, 1480000000.0, 1600000000.0, 1600000000.0, 0.0, 0.0, 140000000.0, 42000000.0, 49000000.0, 49000000.0, 0.0, 0.0, 2025000000.0, 1063000000.0, 2432000000.0, 2432000000.0, 0.0, 0.0, 0.0, 24000000.0, 10000000.0, 534000000.0, 534000000.0, 0.0, 0.0, 479000000.0, 143700000.0, 167650000.0, 167650000.0, 25000000.0, 25000000.0, 1197000000.0, 180000000.0, 393500000.0, 393500000.0, 696000000.0, 0.0, 141000000.0, 61000000.0, 277000000.0, 21000000.0, 21000000.0, 0.0, 0.0, 866790000.0, 260036999.99999997, 303376500.0, 303376500.0, 98590000.0, 98590000.0, 848000000.0, 1708000000.0, 120140000.0, 120140000.0, 2761920000.0, 0.0, 863606000.0, 364150000.0, 554500000.0, 27630000.0, 27630000.0, 0.0],
'fk_kraftwerkstyp': [1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14], 'kwt_id': [1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 
9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14, 1, 4, 5, 6, 7, 12, 13, 9, 11, 2, 3, 8, 17, 19, 18, 10, 16, 15, 14],
'bez_kraftwerkstyp': ['Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 
'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 
'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 
'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 
'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine', 'Biomassekraftwerk', 'Gasturbine', 'Gasturbine', 'Gaskombikraftwerk', 'Gaskombikraftwerk', 'Steinkohlekraftwerk', 'Steinkohlekraftwerk', 'Laufwasserkraftwerk', 'Speicherwasserkraftwerk', 'Braunkohlekraftwerk', 'Braunkohlekraftwerk', 'Kernkraftwerk', 'Oelkraftwerk', 'Weitere', 'Weitere', 'Photovoltaik', 'Windturbine', 'Windturbine', 'Windturbine'], 'bez_subtyp': ['', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', 
'', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore 
starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore', '', 'absicherung', 'peak', 'alt', 'neu', 'alt', 'neu', '', '', 'alt', 'neu', '', '', 'non RES', 'RES', '', 'onshore starkwind', 'onshore schwachwind', 'offshore'],
'wirkungsgrad': [0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 
0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9, 0.6, 0.7, 0.7, 0.65, 0.68, 0.5, 0.53, 0.93, 0.9, 0.55, 0.58, 0.35, 0.5, 0.5, 0.85, 0.85, 0.85, 0.8, 0.9], 'spez_opex': [0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 
0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523, 0.566, 0.709, 0.623, 0.63, 0.56, 0.697, 0.784, 0.789, 0.628, 0.696, 0.508, 0.472, 0.723, 0.723, 0.723, 0.687, 0.723, 0.512, 0.523], 'capex': [0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 
0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 
0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006, 0.0008, 0.0007, 0.0005, 0.0006, 0.0008, 0.0006, 0.0008, 0.0007, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0007, 0.0007, 0.0006, 0.0007, 0.0006, 0.0006],
'p_typisch': [50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 
600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 
300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0, 50000000.0, 300000000.0, 200000000.0, 500000000.0, 600000000.0, 500000000.0, 550000000.0, 7000000.0, 100000000.0, 300000000.0, 400000000.0, 1000000000.0, 500000000.0, 100000000.0, 
50000000.0, 50000000.0, 80000000.0, 50000000.0, 200000000.0],
'spez_info': [{}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, 
{}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.1}, {'NH': 100, 'Z0': 0.0002}],
'entsorgungspreis': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'fk_brennstofftyp': [2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6],
'brennstofftyp_id': [2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6, 2, 1, 1, 1, 1, 5, 5, 6, 6, 4, 4, 3, 7, 9, 8, 6, 6, 6, 6],
'bez_brennstofftyp': ['Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 
'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 
'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None', 'Biomasse', 'Erdgas', 'Erdgas', 'Erdgas', 'Erdgas', 'Steinkohle', 'Steinkohle', 'None', 'None', 'Braunkohle', 'Braunkohle', 'Kernbrennstoff', 'Oel', 'Weitere nonRES', 'Weitere RES', 'None', 'None', 'None', 'None'],
'co2emissfakt': [4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 
7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0, 4e-09, 5.6e-08, 5.6e-08, 5.6e-08, 5.6e-08, 9.5e-08, 9.5e-08, 0.0, 0.0, 1e-07, 1e-07, 0.0, 7.8e-08, 5.6e-08, 4e-09, 0.0, 0.0, 0.0, 0.0], 'bs_preis': [2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 
6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 
6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0, 2e-09, 6.1e-09, 6.1e-09, 6.1e-09, 6.1e-09, 2.3e-09, 2.3e-09, 0.0, 0.0, 1.1e-09, 1.1e-09, 4.7e-10, 1.41e-08, 5e-10, 2e-10, 0.0, 0.0, 0.0, 0.0], 'co2_preis': [0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 
0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 
0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018, 0.018],
'co2_kosten': [1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 
2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 
3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 
3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0, 1.2e-10, 1.4399999999999998e-09, 1.4399999999999998e-09, 1.5507692307692305e-09, 1.4823529411764703e-09, 3.4199999999999998e-09, 3.2264150943396223e-09, 0.0, 0.0, 3.272727272727272e-09, 3.103448275862069e-09, 0.0, 2.808e-09, 2.0159999999999997e-09, 8.470588235294118e-11, 0.0, 0.0, 0.0, 0.0],
'entsorgungskosten': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'brennstoffkosten': [3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 
4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 
8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 
9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0, 3.333333333333334e-09, 8.714285714285715e-09, 8.714285714285715e-09, 9.384615384615384e-09, 8.970588235294117e-09, 4.6e-09, 4.3396226415094335e-09, 0.0, 0.0, 1.9999999999999997e-09, 1.896551724137931e-09, 1.342857142857143e-09, 2.82e-08, 1e-09, 2.352941176470588e-10, 0.0, 0.0, 0.0, 0.0]}
'''
async def func_peri(self, prep_to_peri=None):
# set output
self.set_output("kw_data", self.KWDaten)
| 1,132.581633 | 12,862 | 0.640031 | 20,967 | 110,993 | 3.363476 | 0.049173 | 0.091915 | 0.124642 | 0.151442 | 0.78455 | 0.779544 | 0.778906 | 0.77831 | 0.77831 | 0.777346 | 0 | 0.477445 | 0.126224 | 110,993 | 97 | 12,863 | 1,144.257732 | 0.249714 | 0.013659 | 0 | 0 | 0 | 0 | 0.154032 | 0.022211 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.136364 | 0 | 0.227273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
27842a49ed6a9ff569370a96d1eb4d256395d989 | 1,319 | py | Python | model/models.py | huylb314/sample-django | 5c53e05ccd62abc075e4a9942681ab845d5be2e0 | [
"MIT"
] | null | null | null | model/models.py | huylb314/sample-django | 5c53e05ccd62abc075e4a9942681ab845d5be2e0 | [
"MIT"
] | null | null | null | model/models.py | huylb314/sample-django | 5c53e05ccd62abc075e4a9942681ab845d5be2e0 | [
"MIT"
] | null | null | null | from django.db import models
from jsonfield import JSONField
class Chart(models.Model):
ownerSource = models.CharField(max_length=200, db_index=True)
ownerId = models.CharField(max_length=200, db_index=True)
name = models.CharField(max_length=200)
symbol = models.CharField(max_length=50)
resolution = models.CharField(max_length=10)
lastModified = models.DateTimeField()
content = JSONField()
def __str__(self):
return self.ownerSource + ":" + self.ownerId
def setContent(self, _content):
self.content = _content
class StudyTemplate(models.Model):
ownerSource = models.CharField(max_length=200, db_index=True)
ownerId = models.CharField(max_length=200, db_index=True)
name = models.CharField(max_length=200)
content = JSONField()
def __str__(self):
return self.ownerSource + ":" + self.ownerId
def setContent(self, _content):
self.content = _content
class DrawingTemplate(models.Model):
ownerSource = models.CharField(max_length=200, db_index=True)
ownerId = models.CharField(max_length=200, db_index=True)
name = models.CharField(max_length=200)
tool = models.CharField(max_length=200)
content = JSONField()
def __str__(self):
return self.ownerSource + ":" + self.ownerId
def setContent(self, _content):
self.content = _content
| 29.311111 | 63 | 0.738438 | 165 | 1,319 | 5.684848 | 0.2 | 0.191898 | 0.230277 | 0.307036 | 0.794243 | 0.794243 | 0.794243 | 0.794243 | 0.794243 | 0.794243 | 0 | 0.030357 | 0.150872 | 1,319 | 44 | 64 | 29.977273 | 0.807143 | 0 | 0 | 0.727273 | 0 | 0 | 0.002353 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.060606 | 0.090909 | 0.909091 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 9 |
279a189df5301ebad3ec6f7a2a0ed28efec0e15d | 1,552 | py | Python | tests/functions/test_norm.py | nlp-greyfoss/metagrad | 0f32f177ced1478f0c75ad37bace9a9fc4044ba3 | [
"MIT"
] | 7 | 2022-01-27T05:38:02.000Z | 2022-03-30T01:48:00.000Z | tests/functions/test_norm.py | nlp-greyfoss/metagrad | 0f32f177ced1478f0c75ad37bace9a9fc4044ba3 | [
"MIT"
] | null | null | null | tests/functions/test_norm.py | nlp-greyfoss/metagrad | 0f32f177ced1478f0c75ad37bace9a9fc4044ba3 | [
"MIT"
] | 2 | 2022-02-22T07:47:02.000Z | 2022-03-22T08:31:59.000Z | import numpy as np
from metagrad import Tensor
import metagrad.functions as F
import torch
def test_simple_norm_1():
x = np.arange(9, dtype=np.float32) - 4
x = x.reshape((3, 3))
mx = Tensor(x, requires_grad=True)
y = F.norm(mx, 1)
tx = torch.tensor(x, requires_grad=True)
ty = torch.norm(tx, p=1)
assert np.allclose(y.data, ty.data)
y.backward()
ty.backward()
assert np.allclose(mx.grad.data, tx.grad.data)
def test_simple_norm_2():
x = np.arange(9, dtype=np.float32) - 4
x = x.reshape((3, 3))
mx = Tensor(x, requires_grad=True)
y = F.norm(mx)
tx = torch.tensor(x, requires_grad=True)
ty = torch.norm(tx)
assert np.allclose(y.data, ty.data)
y.backward()
ty.backward()
assert np.allclose(mx.grad.data, tx.grad.data)
def test_norm_1():
x = np.array([[1, 2, 3], [-1, 1, 4]], dtype=np.float32)
mx = Tensor(x, requires_grad=True)
y = F.norm(mx, p=1, axis=0)
tx = torch.tensor(x, requires_grad=True)
ty = torch.norm(tx, p=1, dim=0)
assert np.allclose(y.data, ty.data)
y.sum().backward()
ty.sum().backward()
assert np.allclose(mx.grad.data, tx.grad.data)
def test_norm_2():
x = np.array([[1, 2, 3], [-1, 1, 4]], dtype=np.float32)
mx = Tensor(x, requires_grad=True)
y = F.norm(mx, axis=0)
tx = torch.tensor(x, requires_grad=True)
ty = torch.norm(tx, dim=0)
assert np.allclose(y.data, ty.data)
y.sum().backward()
ty.sum().backward()
assert np.allclose(mx.grad.data, tx.grad.data)
| 20.421053 | 59 | 0.612758 | 266 | 1,552 | 3.507519 | 0.154135 | 0.060021 | 0.128617 | 0.162915 | 0.882101 | 0.882101 | 0.882101 | 0.882101 | 0.882101 | 0.882101 | 0 | 0.032922 | 0.217139 | 1,552 | 75 | 60 | 20.693333 | 0.734979 | 0 | 0 | 0.652174 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 1 | 0.086957 | false | 0 | 0.086957 | 0 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
27d52dcf2c4d95e36787c5fb9cc103ee67d12a77 | 175,939 | py | Python | Thrift/gen-py/SpotifakeServices/ContentCreatorService.py | BrunoLujan/Spotifake-DESER | a811444af0a1326659dd27949c6a1c66c7cd66a1 | [
"Apache-2.0"
] | null | null | null | Thrift/gen-py/SpotifakeServices/ContentCreatorService.py | BrunoLujan/Spotifake-DESER | a811444af0a1326659dd27949c6a1c66c7cd66a1 | [
"Apache-2.0"
] | null | null | null | Thrift/gen-py/SpotifakeServices/ContentCreatorService.py | BrunoLujan/Spotifake-DESER | a811444af0a1326659dd27949c6a1c66c7cd66a1 | [
"Apache-2.0"
] | null | null | null | #
# Autogenerated by Thrift Compiler (0.13.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def GetContentCreators(self):
"""
Get ContentCreator
@return list<ContentCreator>
ContentCreator list
"""
pass
def GetContentCreatorById(self, idContentCreator):
"""
Get ContentCreator by Id
@param idContentCreator
The ContentCreator Id to be obtained.
@return ContentCreator
ContentCreator object
Parameters:
- idContentCreator
"""
pass
def GetContentCreatorByLibraryId(self, idLibrary):
"""
Get ContentCreator by Library Id
@param idLibrary
The Library Id to be obtained.
@return ContentCreator list
list<ContentCreator>
Parameters:
- idLibrary
"""
pass
def GetContentCreatorByEmail(self, email):
"""
Get ContentCreator by email
@param email
The ContentCreator email to be obtained.
@return bool
bool object
Parameters:
- email
"""
pass
def GetContentCreatorByStageName(self, email):
"""
Get ContentCreator by stage name
@param email
The ContentCreator stage name to be obtained.
@return bool
bool object
Parameters:
- email
"""
pass
def AddContentCreator(self, newContentCreator):
"""
Register a Content Creator.
@param newContentCreator
@return ContentCreator
ContentCreator object added
Parameters:
- newContentCreator
"""
pass
def DeleteContentCreator(self, email):
"""
Delete a ContentCreator
@param email
The Content Creator email of the Content Creator to be deleted.
@return Id
The Content Creator Id of the Content Creator deleted.
Parameters:
- email
"""
pass
def UpdateContentCreatorPassword(self, email, newPassword):
"""
Update previously registered Content Creator password.
@param email
The email of the Content Creator which requires a password update.
@return ContentCreator
Modified ContentCreator object.
Parameters:
- email
- newPassword
"""
pass
def UpdateContentCreatorImage(self, email, fileName):
"""
Update previously registered Content Creator image.
@param email
The email of the Content Creator which requires an image update.
@return bool
True or False
Parameters:
- email
- fileName
"""
pass
def UpdateContentCreatorStageName(self, email, currentPassword, newStageName):
"""
Update previously registered Content Creator stage name.
@param email
The email of the Content Creator which requires a stage name update.
@return ContentCreator
Modified ContentCreator object.
Parameters:
- email
- currentPassword
- newStageName
"""
pass
def UpdateContentCreatorDescription(self, email, currentPassword, newDescription):
"""
Update previously registered Content Creator description.
@param email
The email of the Content Creator which requires a description update.
@return ContentCreator
Modified ContentCreator object
Parameters:
- email
- currentPassword
- newDescription
"""
pass
def LoginContentCreator(self, email, password):
"""
Allows the login of a content creator
@param email
The Content Creator email
@param password
The Email password of the content creator
@return Content Creator
Content Creator object
Parameters:
- email
- password
"""
pass
def AddContentCreatorToLibrary(self, idLibrary, idContenCreator):
"""
Add a ContentCreator to Library.
@param idLibrary
The Library Id to which a content creator will be added
@param idContenCreator
The ContentCreator Id to be added.
@return ContentCreator
ContentCreator object added
Parameters:
- idLibrary
- idContenCreator
"""
pass
def DeleteLibraryContentCreator(self, idLibrary, idContentCreator):
"""
Delete a Content Creator from a Library
@param idLibrary
The Library Id which a content creator will be deleted.
@param idContentCreator
The Content Creator Id which will be deleted
@return Id
The Content Creator Id of the Content Creator deleted.
Parameters:
- idLibrary
- idContentCreator
"""
pass
def GetContentCreatorByQuery(self, query):
"""
Get ContentCreator by Query
@param query
The query to be obtained
@return ContentCreator
list<contentCreator>
Parameters:
- query
"""
pass
def AddImageToMedia(self, fileName, image):
"""
Add image file binary
@param image
The image binary to be stored.
@return bool
true or false.
Parameters:
- fileName
- image
"""
pass
def GetImageToMedia(self, fileName):
"""
Get image file binary
@param fileName
The name of the file to retrieve.
@return binary
binary image.
Parameters:
- fileName
"""
pass
def DeleteImageToMedia(self, fileName):
"""
Delete image file binary
@param fileName
The name of the file to be deleted.
@return bool
True or False
Parameters:
- fileName
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def GetContentCreators(self):
"""
Get ContentCreator
@return list<ContentCreator>
ContentCreator list
"""
self.send_GetContentCreators()
return self.recv_GetContentCreators()
def send_GetContentCreators(self):
self._oprot.writeMessageBegin('GetContentCreators', TMessageType.CALL, self._seqid)
args = GetContentCreators_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreators(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreators_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreators failed: unknown result")
def GetContentCreatorById(self, idContentCreator):
"""
Get ContentCreator by Id
@param idContentCreator
The ContentCreator Id to be obtained.
@return ContentCreator
ContentCreator object
Parameters:
- idContentCreator
"""
self.send_GetContentCreatorById(idContentCreator)
return self.recv_GetContentCreatorById()
def send_GetContentCreatorById(self, idContentCreator):
self._oprot.writeMessageBegin('GetContentCreatorById', TMessageType.CALL, self._seqid)
args = GetContentCreatorById_args()
args.idContentCreator = idContentCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreatorById(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreatorById_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreatorById failed: unknown result")
def GetContentCreatorByLibraryId(self, idLibrary):
"""
Get ContentCreator by Library Id
@param idLibrary
The Library Id to be obtained.
@return ContentCreator list
list<ContentCreator>
Parameters:
- idLibrary
"""
self.send_GetContentCreatorByLibraryId(idLibrary)
return self.recv_GetContentCreatorByLibraryId()
def send_GetContentCreatorByLibraryId(self, idLibrary):
self._oprot.writeMessageBegin('GetContentCreatorByLibraryId', TMessageType.CALL, self._seqid)
args = GetContentCreatorByLibraryId_args()
args.idLibrary = idLibrary
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreatorByLibraryId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreatorByLibraryId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreatorByLibraryId failed: unknown result")
def GetContentCreatorByEmail(self, email):
"""
Get ContentCreator by email
@param email
The ContentCreator email to be obtained.
@return bool
bool object
Parameters:
- email
"""
self.send_GetContentCreatorByEmail(email)
return self.recv_GetContentCreatorByEmail()
def send_GetContentCreatorByEmail(self, email):
self._oprot.writeMessageBegin('GetContentCreatorByEmail', TMessageType.CALL, self._seqid)
args = GetContentCreatorByEmail_args()
args.email = email
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreatorByEmail(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreatorByEmail_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreatorByEmail failed: unknown result")
def GetContentCreatorByStageName(self, email):
"""
Get ContentCreator by stage name
@param email
The ContentCreator stage name to look up (the parameter is named email in the generated interface).
@return bool
bool object
Parameters:
- email
"""
self.send_GetContentCreatorByStageName(email)
return self.recv_GetContentCreatorByStageName()
def send_GetContentCreatorByStageName(self, email):
self._oprot.writeMessageBegin('GetContentCreatorByStageName', TMessageType.CALL, self._seqid)
args = GetContentCreatorByStageName_args()
args.email = email
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreatorByStageName(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreatorByStageName_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreatorByStageName failed: unknown result")
def AddContentCreator(self, newContentCreator):
"""
Register a Content Creator.
@param newContentCreator
The ContentCreator object to register.
@return ContentCreator
ContentCreator object added
Parameters:
- newContentCreator
"""
self.send_AddContentCreator(newContentCreator)
return self.recv_AddContentCreator()
def send_AddContentCreator(self, newContentCreator):
self._oprot.writeMessageBegin('AddContentCreator', TMessageType.CALL, self._seqid)
args = AddContentCreator_args()
args.newContentCreator = newContentCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddContentCreator(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddContentCreator_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddContentCreator failed: unknown result")
def DeleteContentCreator(self, email):
"""
Delete a ContentCreator
@param email
The email of the Content Creator to be deleted.
@return Id
The Content Creator Id of the Content Creator deleted.
Parameters:
- email
"""
self.send_DeleteContentCreator(email)
return self.recv_DeleteContentCreator()
def send_DeleteContentCreator(self, email):
self._oprot.writeMessageBegin('DeleteContentCreator', TMessageType.CALL, self._seqid)
args = DeleteContentCreator_args()
args.email = email
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeleteContentCreator(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeleteContentCreator_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeleteContentCreator failed: unknown result")
def UpdateContentCreatorPassword(self, email, newPassword):
"""
Update previously registered Content Creator password.
@param email
The email of the Content Creator whose password is to be updated.
@return ContentCreator
The modified ContentCreator object.
Parameters:
- email
- newPassword
"""
self.send_UpdateContentCreatorPassword(email, newPassword)
return self.recv_UpdateContentCreatorPassword()
def send_UpdateContentCreatorPassword(self, email, newPassword):
self._oprot.writeMessageBegin('UpdateContentCreatorPassword', TMessageType.CALL, self._seqid)
args = UpdateContentCreatorPassword_args()
args.email = email
args.newPassword = newPassword
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateContentCreatorPassword(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateContentCreatorPassword_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateContentCreatorPassword failed: unknown result")
def UpdateContentCreatorImage(self, email, fileName):
"""
Update previously registered Content Creator image.
@param email
The email of the Content Creator whose image is to be updated.
@return bool
True on success, False otherwise.
Parameters:
- email
- fileName
"""
self.send_UpdateContentCreatorImage(email, fileName)
return self.recv_UpdateContentCreatorImage()
def send_UpdateContentCreatorImage(self, email, fileName):
self._oprot.writeMessageBegin('UpdateContentCreatorImage', TMessageType.CALL, self._seqid)
args = UpdateContentCreatorImage_args()
args.email = email
args.fileName = fileName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateContentCreatorImage(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateContentCreatorImage_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateContentCreatorImage failed: unknown result")
def UpdateContentCreatorStageName(self, email, currentPassword, newStageName):
"""
Update previously registered Content Creator stage name.
@param email
The email of the Content Creator whose stage name is to be updated.
@return ContentCreator
The modified ContentCreator object.
Parameters:
- email
- currentPassword
- newStageName
"""
self.send_UpdateContentCreatorStageName(email, currentPassword, newStageName)
return self.recv_UpdateContentCreatorStageName()
def send_UpdateContentCreatorStageName(self, email, currentPassword, newStageName):
self._oprot.writeMessageBegin('UpdateContentCreatorStageName', TMessageType.CALL, self._seqid)
args = UpdateContentCreatorStageName_args()
args.email = email
args.currentPassword = currentPassword
args.newStageName = newStageName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateContentCreatorStageName(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateContentCreatorStageName_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateContentCreatorStageName failed: unknown result")
def UpdateContentCreatorDescription(self, email, currentPassword, newDescription):
"""
Update previously registered Content Creator description.
@param email
The email of the Content Creator whose description is to be updated.
@return ContentCreator
The modified ContentCreator object.
Parameters:
- email
- currentPassword
- newDescription
"""
self.send_UpdateContentCreatorDescription(email, currentPassword, newDescription)
return self.recv_UpdateContentCreatorDescription()
def send_UpdateContentCreatorDescription(self, email, currentPassword, newDescription):
self._oprot.writeMessageBegin('UpdateContentCreatorDescription', TMessageType.CALL, self._seqid)
args = UpdateContentCreatorDescription_args()
args.email = email
args.currentPassword = currentPassword
args.newDescription = newDescription
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateContentCreatorDescription(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateContentCreatorDescription_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateContentCreatorDescription failed: unknown result")
def LoginContentCreator(self, email, password):
"""
Allows the login of a content creator.
@param email
The Content Creator email.
@param password
The content creator's password.
@return ContentCreator
ContentCreator object.
Parameters:
- email
- password
"""
self.send_LoginContentCreator(email, password)
return self.recv_LoginContentCreator()
def send_LoginContentCreator(self, email, password):
self._oprot.writeMessageBegin('LoginContentCreator', TMessageType.CALL, self._seqid)
args = LoginContentCreator_args()
args.email = email
args.password = password
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_LoginContentCreator(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = LoginContentCreator_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorUserE is not None:
raise result.sErrorUserE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "LoginContentCreator failed: unknown result")
def AddContentCreatorToLibrary(self, idLibrary, idContenCreator):
"""
Add a ContentCreator to Library.
@param idLibrary
The Library Id to which a content creator will be added
@param idContenCreator
The Id of the content creator to be added.
@return ContentCreator
ContentCreator object added
Parameters:
- idLibrary
- idContenCreator
"""
self.send_AddContentCreatorToLibrary(idLibrary, idContenCreator)
return self.recv_AddContentCreatorToLibrary()
def send_AddContentCreatorToLibrary(self, idLibrary, idContenCreator):
self._oprot.writeMessageBegin('AddContentCreatorToLibrary', TMessageType.CALL, self._seqid)
args = AddContentCreatorToLibrary_args()
args.idLibrary = idLibrary
args.idContenCreator = idContenCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddContentCreatorToLibrary(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddContentCreatorToLibrary_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddContentCreatorToLibrary failed: unknown result")
def DeleteLibraryContentCreator(self, idLibrary, idContentCreator):
"""
Delete a Content Creator from a Library
@param idLibrary
The Library Id from which a content creator will be deleted.
@param idContentCreator
The Content Creator Id which will be deleted
@return Id
The Content Creator Id of the Content Creator deleted.
Parameters:
- idLibrary
- idContentCreator
"""
self.send_DeleteLibraryContentCreator(idLibrary, idContentCreator)
return self.recv_DeleteLibraryContentCreator()
def send_DeleteLibraryContentCreator(self, idLibrary, idContentCreator):
self._oprot.writeMessageBegin('DeleteLibraryContentCreator', TMessageType.CALL, self._seqid)
args = DeleteLibraryContentCreator_args()
args.idLibrary = idLibrary
args.idContentCreator = idContentCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeleteLibraryContentCreator(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeleteLibraryContentCreator_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeleteLibraryContentCreator failed: unknown result")
def GetContentCreatorByQuery(self, query):
"""
Get ContentCreator by Query
@param query
The search query.
@return list<ContentCreator>
The content creators matching the query.
Parameters:
- query
"""
self.send_GetContentCreatorByQuery(query)
return self.recv_GetContentCreatorByQuery()
def send_GetContentCreatorByQuery(self, query):
self._oprot.writeMessageBegin('GetContentCreatorByQuery', TMessageType.CALL, self._seqid)
args = GetContentCreatorByQuery_args()
args.query = query
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetContentCreatorByQuery(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetContentCreatorByQuery_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetContentCreatorByQuery failed: unknown result")
def AddImageToMedia(self, fileName, image):
"""
Add an image file as binary data.
@param fileName
The name under which the file will be stored.
@param image
The binary image data to be kept.
@return bool
True on success, False otherwise.
Parameters:
- fileName
- image
"""
self.send_AddImageToMedia(fileName, image)
return self.recv_AddImageToMedia()
def send_AddImageToMedia(self, fileName, image):
self._oprot.writeMessageBegin('AddImageToMedia', TMessageType.CALL, self._seqid)
args = AddImageToMedia_args()
args.fileName = fileName
args.image = image
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddImageToMedia(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddImageToMedia_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddImageToMedia failed: unknown result")
def GetImageToMedia(self, fileName):
"""
Get an image file as binary data.
@param fileName
The name of the file to retrieve.
@return binary
The binary image data.
Parameters:
- fileName
"""
self.send_GetImageToMedia(fileName)
return self.recv_GetImageToMedia()
def send_GetImageToMedia(self, fileName):
self._oprot.writeMessageBegin('GetImageToMedia', TMessageType.CALL, self._seqid)
args = GetImageToMedia_args()
args.fileName = fileName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetImageToMedia(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetImageToMedia_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetImageToMedia failed: unknown result")
def DeleteImageToMedia(self, fileName):
"""
Delete an image file.
@param fileName
The name of the file to be deleted.
@return bool
True on success, False otherwise.
Parameters:
- fileName
"""
self.send_DeleteImageToMedia(fileName)
return self.recv_DeleteImageToMedia()
def send_DeleteImageToMedia(self, fileName):
self._oprot.writeMessageBegin('DeleteImageToMedia', TMessageType.CALL, self._seqid)
args = DeleteImageToMedia_args()
args.fileName = fileName
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeleteImageToMedia(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeleteImageToMedia_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeleteImageToMedia failed: unknown result")
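# Usage sketch (kept as comments so this generated module stays import-safe):
# the Client class above follows the standard Thrift transport/protocol setup.
# Host, port, and the email value below are placeholder assumptions, not part
# of this service definition.
#
#   from thrift.transport import TSocket, TTransport
#   from thrift.protocol import TBinaryProtocol
#
#   transport = TTransport.TBufferedTransport(TSocket.TSocket('localhost', 9090))
#   protocol = TBinaryProtocol.TBinaryProtocol(transport)
#   client = Client(protocol)
#   transport.open()
#   try:
#       creator = client.GetContentCreatorByEmail('user@example.com')
#   finally:
#       transport.close()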
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["GetContentCreators"] = Processor.process_GetContentCreators
self._processMap["GetContentCreatorById"] = Processor.process_GetContentCreatorById
self._processMap["GetContentCreatorByLibraryId"] = Processor.process_GetContentCreatorByLibraryId
self._processMap["GetContentCreatorByEmail"] = Processor.process_GetContentCreatorByEmail
self._processMap["GetContentCreatorByStageName"] = Processor.process_GetContentCreatorByStageName
self._processMap["AddContentCreator"] = Processor.process_AddContentCreator
self._processMap["DeleteContentCreator"] = Processor.process_DeleteContentCreator
self._processMap["UpdateContentCreatorPassword"] = Processor.process_UpdateContentCreatorPassword
self._processMap["UpdateContentCreatorImage"] = Processor.process_UpdateContentCreatorImage
self._processMap["UpdateContentCreatorStageName"] = Processor.process_UpdateContentCreatorStageName
self._processMap["UpdateContentCreatorDescription"] = Processor.process_UpdateContentCreatorDescription
self._processMap["LoginContentCreator"] = Processor.process_LoginContentCreator
self._processMap["AddContentCreatorToLibrary"] = Processor.process_AddContentCreatorToLibrary
self._processMap["DeleteLibraryContentCreator"] = Processor.process_DeleteLibraryContentCreator
self._processMap["GetContentCreatorByQuery"] = Processor.process_GetContentCreatorByQuery
self._processMap["AddImageToMedia"] = Processor.process_AddImageToMedia
self._processMap["GetImageToMedia"] = Processor.process_GetImageToMedia
self._processMap["DeleteImageToMedia"] = Processor.process_DeleteImageToMedia
self._on_message_begin = None
def on_message_begin(self, func):
self._on_message_begin = func
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if self._on_message_begin:
self._on_message_begin(name, type, seqid)
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_GetContentCreators(self, seqid, iprot, oprot):
args = GetContentCreators_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreators_result()
try:
result.success = self._handler.GetContentCreators()
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreators", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetContentCreatorById(self, seqid, iprot, oprot):
args = GetContentCreatorById_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreatorById_result()
try:
result.success = self._handler.GetContentCreatorById(args.idContentCreator)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreatorById", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetContentCreatorByLibraryId(self, seqid, iprot, oprot):
args = GetContentCreatorByLibraryId_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreatorByLibraryId_result()
try:
result.success = self._handler.GetContentCreatorByLibraryId(args.idLibrary)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreatorByLibraryId", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetContentCreatorByEmail(self, seqid, iprot, oprot):
args = GetContentCreatorByEmail_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreatorByEmail_result()
try:
result.success = self._handler.GetContentCreatorByEmail(args.email)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreatorByEmail", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetContentCreatorByStageName(self, seqid, iprot, oprot):
args = GetContentCreatorByStageName_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreatorByStageName_result()
try:
result.success = self._handler.GetContentCreatorByStageName(args.email)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreatorByStageName", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_AddContentCreator(self, seqid, iprot, oprot):
args = AddContentCreator_args()
args.read(iprot)
iprot.readMessageEnd()
result = AddContentCreator_result()
try:
result.success = self._handler.AddContentCreator(args.newContentCreator)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("AddContentCreator", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_DeleteContentCreator(self, seqid, iprot, oprot):
args = DeleteContentCreator_args()
args.read(iprot)
iprot.readMessageEnd()
result = DeleteContentCreator_result()
try:
result.success = self._handler.DeleteContentCreator(args.email)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("DeleteContentCreator", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_UpdateContentCreatorPassword(self, seqid, iprot, oprot):
args = UpdateContentCreatorPassword_args()
args.read(iprot)
iprot.readMessageEnd()
result = UpdateContentCreatorPassword_result()
try:
result.success = self._handler.UpdateContentCreatorPassword(args.email, args.newPassword)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("UpdateContentCreatorPassword", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_UpdateContentCreatorImage(self, seqid, iprot, oprot):
args = UpdateContentCreatorImage_args()
args.read(iprot)
iprot.readMessageEnd()
result = UpdateContentCreatorImage_result()
try:
result.success = self._handler.UpdateContentCreatorImage(args.email, args.fileName)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("UpdateContentCreatorImage", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_UpdateContentCreatorStageName(self, seqid, iprot, oprot):
args = UpdateContentCreatorStageName_args()
args.read(iprot)
iprot.readMessageEnd()
result = UpdateContentCreatorStageName_result()
try:
result.success = self._handler.UpdateContentCreatorStageName(args.email, args.currentPassword, args.newStageName)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("UpdateContentCreatorStageName", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_UpdateContentCreatorDescription(self, seqid, iprot, oprot):
args = UpdateContentCreatorDescription_args()
args.read(iprot)
iprot.readMessageEnd()
result = UpdateContentCreatorDescription_result()
try:
result.success = self._handler.UpdateContentCreatorDescription(args.email, args.currentPassword, args.newDescription)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
msg_type = TMessageType.REPLY
result.sErrorInvalidRequestE = sErrorInvalidRequestE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("UpdateContentCreatorDescription", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_LoginContentCreator(self, seqid, iprot, oprot):
args = LoginContentCreator_args()
args.read(iprot)
iprot.readMessageEnd()
result = LoginContentCreator_result()
try:
result.success = self._handler.LoginContentCreator(args.email, args.password)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorUserException as sErrorUserE:
msg_type = TMessageType.REPLY
result.sErrorUserE = sErrorUserE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("LoginContentCreator", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_AddContentCreatorToLibrary(self, seqid, iprot, oprot):
args = AddContentCreatorToLibrary_args()
args.read(iprot)
iprot.readMessageEnd()
result = AddContentCreatorToLibrary_result()
try:
result.success = self._handler.AddContentCreatorToLibrary(args.idLibrary, args.idContenCreator)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("AddContentCreatorToLibrary", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_DeleteLibraryContentCreator(self, seqid, iprot, oprot):
args = DeleteLibraryContentCreator_args()
args.read(iprot)
iprot.readMessageEnd()
result = DeleteLibraryContentCreator_result()
try:
result.success = self._handler.DeleteLibraryContentCreator(args.idLibrary, args.idContentCreator)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("DeleteLibraryContentCreator", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetContentCreatorByQuery(self, seqid, iprot, oprot):
args = GetContentCreatorByQuery_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetContentCreatorByQuery_result()
try:
result.success = self._handler.GetContentCreatorByQuery(args.query)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetContentCreatorByQuery", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_AddImageToMedia(self, seqid, iprot, oprot):
args = AddImageToMedia_args()
args.read(iprot)
iprot.readMessageEnd()
result = AddImageToMedia_result()
try:
result.success = self._handler.AddImageToMedia(args.fileName, args.image)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("AddImageToMedia", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetImageToMedia(self, seqid, iprot, oprot):
args = GetImageToMedia_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetImageToMedia_result()
try:
result.success = self._handler.GetImageToMedia(args.fileName)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetImageToMedia", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_DeleteImageToMedia(self, seqid, iprot, oprot):
args = DeleteImageToMedia_args()
args.read(iprot)
iprot.readMessageEnd()
result = DeleteImageToMedia_result()
try:
result.success = self._handler.DeleteImageToMedia(args.fileName)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("DeleteImageToMedia", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class GetContentCreators_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreators_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreators_args)
GetContentCreators_args.thrift_spec = (
)
class GetContentCreators_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype3, _size0) = iprot.readListBegin()
for _i4 in range(_size0):
_elem5 = SpotifakeManagement.ttypes.ContentCreator()
_elem5.read(iprot)
self.success.append(_elem5)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreators_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter6 in self.success:
iter6.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreators_result)
GetContentCreators_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.ContentCreator, None], False), None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class GetContentCreatorById_args(object):
"""
Attributes:
- idContentCreator
"""
def __init__(self, idContentCreator=None,):
self.idContentCreator = idContentCreator
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idContentCreator = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorById_args')
if self.idContentCreator is not None:
oprot.writeFieldBegin('idContentCreator', TType.I16, 1)
oprot.writeI16(self.idContentCreator)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorById_args)
GetContentCreatorById_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idContentCreator', None, None, ), # 1
)
class GetContentCreatorById_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = SpotifakeManagement.ttypes.ContentCreator()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorById_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorById_result)
GetContentCreatorById_result.thrift_spec = (
(0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.ContentCreator, None], None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class GetContentCreatorByLibraryId_args(object):
"""
Attributes:
- idLibrary
"""
def __init__(self, idLibrary=None,):
self.idLibrary = idLibrary
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idLibrary = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByLibraryId_args')
if self.idLibrary is not None:
oprot.writeFieldBegin('idLibrary', TType.I16, 1)
oprot.writeI16(self.idLibrary)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByLibraryId_args)
GetContentCreatorByLibraryId_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idLibrary', None, None, ), # 1
)
class GetContentCreatorByLibraryId_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype10, _size7) = iprot.readListBegin()
for _i11 in range(_size7):
_elem12 = SpotifakeManagement.ttypes.ContentCreator()
_elem12.read(iprot)
self.success.append(_elem12)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByLibraryId_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter13 in self.success:
iter13.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByLibraryId_result)
GetContentCreatorByLibraryId_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.ContentCreator, None], False), None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class GetContentCreatorByEmail_args(object):
"""
Attributes:
- email
"""
def __init__(self, email=None,):
self.email = email
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByEmail_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByEmail_args)
GetContentCreatorByEmail_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
)
class GetContentCreatorByEmail_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByEmail_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByEmail_result)
GetContentCreatorByEmail_result.thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class GetContentCreatorByStageName_args(object):
"""
Attributes:
- email
"""
def __init__(self, email=None,):
self.email = email
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByStageName_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByStageName_args)
GetContentCreatorByStageName_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
)
class GetContentCreatorByStageName_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetContentCreatorByStageName_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetContentCreatorByStageName_result)
GetContentCreatorByStageName_result.thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class AddContentCreator_args(object):
"""
Attributes:
- newContentCreator
"""
def __init__(self, newContentCreator=None,):
self.newContentCreator = newContentCreator
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.newContentCreator = SpotifakeManagement.ttypes.ContentCreator()
self.newContentCreator.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddContentCreator_args')
if self.newContentCreator is not None:
oprot.writeFieldBegin('newContentCreator', TType.STRUCT, 1)
self.newContentCreator.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddContentCreator_args)
AddContentCreator_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'newContentCreator', [SpotifakeManagement.ttypes.ContentCreator, None], None, ), # 1
)
class AddContentCreator_result(object):
"""
Attributes:
- success
- sErrorUserE
"""
def __init__(self, success=None, sErrorUserE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = SpotifakeManagement.ttypes.ContentCreator()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddContentCreator_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddContentCreator_result)
AddContentCreator_result.thrift_spec = (
(0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.ContentCreator, None], None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
)
class DeleteContentCreator_args(object):
"""
Attributes:
- email
"""
def __init__(self, email=None,):
self.email = email
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeleteContentCreator_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeleteContentCreator_args)
DeleteContentCreator_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
)
class DeleteContentCreator_result(object):
"""
Attributes:
- success
- sErrorNotFoundE
- sErrorSystemE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I16:
self.success = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeleteContentCreator_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I16, 0)
oprot.writeI16(self.success)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeleteContentCreator_result)
DeleteContentCreator_result.thrift_spec = (
(0, TType.I16, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 3
)
class UpdateContentCreatorPassword_args(object):
"""
Attributes:
- email
- newPassword
"""
def __init__(self, email=None, newPassword=None,):
self.email = email
self.newPassword = newPassword
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.newPassword = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorPassword_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
if self.newPassword is not None:
oprot.writeFieldBegin('newPassword', TType.STRING, 2)
oprot.writeString(self.newPassword.encode('utf-8') if sys.version_info[0] == 2 else self.newPassword)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorPassword_args)
UpdateContentCreatorPassword_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
(2, TType.STRING, 'newPassword', 'UTF8', None, ), # 2
)
class UpdateContentCreatorPassword_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorSystemE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorPassword_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 3)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 4)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorPassword_result)
UpdateContentCreatorPassword_result.thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 3
(4, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 4
)
class UpdateContentCreatorImage_args(object):
"""
Attributes:
- email
- fileName
"""
def __init__(self, email=None, fileName=None,):
self.email = email
self.fileName = fileName
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.fileName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorImage_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
if self.fileName is not None:
oprot.writeFieldBegin('fileName', TType.STRING, 2)
oprot.writeString(self.fileName.encode('utf-8') if sys.version_info[0] == 2 else self.fileName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorImage_args)
UpdateContentCreatorImage_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
(2, TType.STRING, 'fileName', 'UTF8', None, ), # 2
)
class UpdateContentCreatorImage_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorSystemE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorImage_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 3)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 4)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorImage_result)
UpdateContentCreatorImage_result.thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 3
(4, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 4
)
class UpdateContentCreatorStageName_args(object):
"""
Attributes:
- email
- currentPassword
- newStageName
"""
def __init__(self, email=None, currentPassword=None, newStageName=None,):
self.email = email
self.currentPassword = currentPassword
self.newStageName = newStageName
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.currentPassword = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.newStageName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorStageName_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
if self.currentPassword is not None:
oprot.writeFieldBegin('currentPassword', TType.STRING, 2)
oprot.writeString(self.currentPassword.encode('utf-8') if sys.version_info[0] == 2 else self.currentPassword)
oprot.writeFieldEnd()
if self.newStageName is not None:
oprot.writeFieldBegin('newStageName', TType.STRING, 3)
oprot.writeString(self.newStageName.encode('utf-8') if sys.version_info[0] == 2 else self.newStageName)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorStageName_args)
UpdateContentCreatorStageName_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
(2, TType.STRING, 'currentPassword', 'UTF8', None, ), # 2
(3, TType.STRING, 'newStageName', 'UTF8', None, ), # 3
)
class UpdateContentCreatorStageName_result(object):
"""
Attributes:
- success
- sErrorUserE
- sErrorNotFoundE
- sErrorSystemE
- sErrorInvalidRequestE
"""
def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
self.success = success
self.sErrorUserE = sErrorUserE
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
self.sErrorInvalidRequestE = sErrorInvalidRequestE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = SpotifakeManagement.ttypes.ContentCreator()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
self.sErrorUserE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRUCT:
self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
self.sErrorInvalidRequestE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorStageName_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.sErrorUserE is not None:
oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
self.sErrorUserE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 3)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorInvalidRequestE is not None:
oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 4)
self.sErrorInvalidRequestE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorStageName_result)
UpdateContentCreatorStageName_result.thrift_spec = (
(0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.ContentCreator, None], None, ), # 0
(1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 2
(3, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 3
(4, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ), # 4
)
class UpdateContentCreatorDescription_args(object):
"""
Attributes:
- email
- currentPassword
- newDescription
"""
def __init__(self, email=None, currentPassword=None, newDescription=None,):
self.email = email
self.currentPassword = currentPassword
self.newDescription = newDescription
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRING:
self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.currentPassword = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.newDescription = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('UpdateContentCreatorDescription_args')
if self.email is not None:
oprot.writeFieldBegin('email', TType.STRING, 1)
oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
oprot.writeFieldEnd()
if self.currentPassword is not None:
oprot.writeFieldBegin('currentPassword', TType.STRING, 2)
oprot.writeString(self.currentPassword.encode('utf-8') if sys.version_info[0] == 2 else self.currentPassword)
oprot.writeFieldEnd()
if self.newDescription is not None:
oprot.writeFieldBegin('newDescription', TType.STRING, 3)
oprot.writeString(self.newDescription.encode('utf-8') if sys.version_info[0] == 2 else self.newDescription)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(UpdateContentCreatorDescription_args)
UpdateContentCreatorDescription_args.thrift_spec = (
None, # 0
(1, TType.STRING, 'email', 'UTF8', None, ), # 1
(2, TType.STRING, 'currentPassword', 'UTF8', None, ), # 2
(3, TType.STRING, 'newDescription', 'UTF8', None, ), # 3
)
class UpdateContentCreatorDescription_result(object):
    """
    Attributes:
     - success
     - sErrorUserE
     - sErrorNotFoundE
     - sErrorSystemE
     - sErrorInvalidRequestE

    """

    def __init__(self, success=None, sErrorUserE=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
        self.success = success
        self.sErrorUserE = sErrorUserE
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE
        self.sErrorInvalidRequestE = sErrorInvalidRequestE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = SpotifakeManagement.ttypes.ContentCreator()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
                    self.sErrorUserE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 4:
                if ftype == TType.STRUCT:
                    self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
                    self.sErrorInvalidRequestE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('UpdateContentCreatorDescription_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorUserE is not None:
            oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
            self.sErrorUserE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 2)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 3)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorInvalidRequestE is not None:
            oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 4)
            self.sErrorInvalidRequestE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(UpdateContentCreatorDescription_result)
UpdateContentCreatorDescription_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.ContentCreator, None], None, ),  # 0
    (1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 2
    (3, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 3
    (4, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ),  # 4
)
class LoginContentCreator_args(object):
    """
    Attributes:
     - email
     - password

    """

    def __init__(self, email=None, password=None,):
        self.email = email
        self.password = password

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.email = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRING:
                    self.password = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('LoginContentCreator_args')
        if self.email is not None:
            oprot.writeFieldBegin('email', TType.STRING, 1)
            oprot.writeString(self.email.encode('utf-8') if sys.version_info[0] == 2 else self.email)
            oprot.writeFieldEnd()
        if self.password is not None:
            oprot.writeFieldBegin('password', TType.STRING, 2)
            oprot.writeString(self.password.encode('utf-8') if sys.version_info[0] == 2 else self.password)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(LoginContentCreator_args)
LoginContentCreator_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'email', 'UTF8', None, ),  # 1
    (2, TType.STRING, 'password', 'UTF8', None, ),  # 2
)
class LoginContentCreator_result(object):
    """
    Attributes:
     - success
     - sErrorUserE
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorUserE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorUserE = sErrorUserE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = SpotifakeManagement.ttypes.ContentCreator()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorUserE = SpotifakeManagement.ttypes.SErrorUserException()
                    self.sErrorUserE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('LoginContentCreator_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorUserE is not None:
            oprot.writeFieldBegin('sErrorUserE', TType.STRUCT, 1)
            self.sErrorUserE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(LoginContentCreator_result)
LoginContentCreator_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.ContentCreator, None], None, ),  # 0
    (1, TType.STRUCT, 'sErrorUserE', [SpotifakeManagement.ttypes.SErrorUserException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class AddContentCreatorToLibrary_args(object):
    """
    Attributes:
     - idLibrary
     - idContenCreator

    """

    def __init__(self, idLibrary=None, idContenCreator=None,):
        self.idLibrary = idLibrary
        self.idContenCreator = idContenCreator

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idLibrary = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.idContenCreator = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddContentCreatorToLibrary_args')
        if self.idLibrary is not None:
            oprot.writeFieldBegin('idLibrary', TType.I16, 1)
            oprot.writeI16(self.idLibrary)
            oprot.writeFieldEnd()
        if self.idContenCreator is not None:
            oprot.writeFieldBegin('idContenCreator', TType.I16, 2)
            oprot.writeI16(self.idContenCreator)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddContentCreatorToLibrary_args)
AddContentCreatorToLibrary_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idLibrary', None, None, ),  # 1
    (2, TType.I16, 'idContenCreator', None, None, ),  # 2
)
class AddContentCreatorToLibrary_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.BOOL:
                    self.success = iprot.readBool()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddContentCreatorToLibrary_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.BOOL, 0)
            oprot.writeBool(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddContentCreatorToLibrary_result)
AddContentCreatorToLibrary_result.thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class DeleteLibraryContentCreator_args(object):
    """
    Attributes:
     - idLibrary
     - idContentCreator

    """

    def __init__(self, idLibrary=None, idContentCreator=None,):
        self.idLibrary = idLibrary
        self.idContentCreator = idContentCreator

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idLibrary = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.idContentCreator = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteLibraryContentCreator_args')
        if self.idLibrary is not None:
            oprot.writeFieldBegin('idLibrary', TType.I16, 1)
            oprot.writeI16(self.idLibrary)
            oprot.writeFieldEnd()
        if self.idContentCreator is not None:
            oprot.writeFieldBegin('idContentCreator', TType.I16, 2)
            oprot.writeI16(self.idContentCreator)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(DeleteLibraryContentCreator_args)
DeleteLibraryContentCreator_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idLibrary', None, None, ),  # 1
    (2, TType.I16, 'idContentCreator', None, None, ),  # 2
)
class DeleteLibraryContentCreator_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I16:
                    self.success = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteLibraryContentCreator_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I16, 0)
            oprot.writeI16(self.success)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(DeleteLibraryContentCreator_result)
DeleteLibraryContentCreator_result.thrift_spec = (
    (0, TType.I16, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class GetContentCreatorByQuery_args(object):
    """
    Attributes:
     - query

    """

    def __init__(self, query=None,):
        self.query = query

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.query = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetContentCreatorByQuery_args')
        if self.query is not None:
            oprot.writeFieldBegin('query', TType.STRING, 1)
            oprot.writeString(self.query.encode('utf-8') if sys.version_info[0] == 2 else self.query)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetContentCreatorByQuery_args)
GetContentCreatorByQuery_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'query', 'UTF8', None, ),  # 1
)
class GetContentCreatorByQuery_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype17, _size14) = iprot.readListBegin()
                    for _i18 in range(_size14):
                        _elem19 = SpotifakeManagement.ttypes.ContentCreator()
                        _elem19.read(iprot)
                        self.success.append(_elem19)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetContentCreatorByQuery_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter20 in self.success:
                iter20.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetContentCreatorByQuery_result)
GetContentCreatorByQuery_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.ContentCreator, None], False), None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class AddImageToMedia_args(object):
    """
    Attributes:
     - fileName
     - image

    """

    def __init__(self, fileName=None, image=None,):
        self.fileName = fileName
        self.image = image

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.fileName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRING:
                    self.image = iprot.readBinary()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddImageToMedia_args')
        if self.fileName is not None:
            oprot.writeFieldBegin('fileName', TType.STRING, 1)
            oprot.writeString(self.fileName.encode('utf-8') if sys.version_info[0] == 2 else self.fileName)
            oprot.writeFieldEnd()
        if self.image is not None:
            oprot.writeFieldBegin('image', TType.STRING, 2)
            oprot.writeBinary(self.image)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddImageToMedia_args)
AddImageToMedia_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'fileName', 'UTF8', None, ),  # 1
    (2, TType.STRING, 'image', 'BINARY', None, ),  # 2
)
class AddImageToMedia_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.BOOL:
                    self.success = iprot.readBool()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddImageToMedia_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.BOOL, 0)
            oprot.writeBool(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddImageToMedia_result)
AddImageToMedia_result.thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class GetImageToMedia_args(object):
    """
    Attributes:
     - fileName

    """

    def __init__(self, fileName=None,):
        self.fileName = fileName

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.fileName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetImageToMedia_args')
        if self.fileName is not None:
            oprot.writeFieldBegin('fileName', TType.STRING, 1)
            oprot.writeString(self.fileName.encode('utf-8') if sys.version_info[0] == 2 else self.fileName)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetImageToMedia_args)
GetImageToMedia_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'fileName', 'UTF8', None, ),  # 1
)
class GetImageToMedia_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRING:
                    self.success = iprot.readBinary()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetImageToMedia_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRING, 0)
            oprot.writeBinary(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetImageToMedia_result)
GetImageToMedia_result.thrift_spec = (
    (0, TType.STRING, 'success', 'BINARY', None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class DeleteImageToMedia_args(object):
    """
    Attributes:
     - fileName

    """

    def __init__(self, fileName=None,):
        self.fileName = fileName

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.fileName = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteImageToMedia_args')
        if self.fileName is not None:
            oprot.writeFieldBegin('fileName', TType.STRING, 1)
            oprot.writeString(self.fileName.encode('utf-8') if sys.version_info[0] == 2 else self.fileName)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(DeleteImageToMedia_args)
DeleteImageToMedia_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'fileName', 'UTF8', None, ),  # 1
)
class DeleteImageToMedia_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE

    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.BOOL:
                    self.success = iprot.readBool()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteImageToMedia_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.BOOL, 0)
            oprot.writeBool(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(DeleteImageToMedia_result)
DeleteImageToMedia_result.thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
fix_spec(all_structs)
del all_structs
# tests/test_remind.py (repo: damani42/jeeves, MIT license)
from jeeves.remind import *
] | null | null | null | from jeeves.remind import *
def test_run_remind():
pass
# regym/logging_server/__init__.py (repo: Danielhp95/Regym, MIT license)
from .log_server_socket import SERVER_SHUTDOWN_MESSAGE
from .log_server_socket import initialize_logger, create_logging_server_process
# src/models.py (repo: aresPanos/dmgp_dfgp_regression, Apache-2.0 license)
# Copyright 2020 Aristeidis Panos
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import Optional, Tuple, TypeVar
import importlib.util
spec = importlib.util.find_spec('silence_tensorflow')
if spec is not None:
import silence_tensorflow.auto
import tensorflow as tf
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras import backend as K_bd
from itertools import product
import os
from gpflow.base import Module, Parameter
from gpflow.config import default_float, set_default_float
from gpflow.utilities import ops, positive
class DMGP_model(Module):
'''
This is the Deep Mercer Gaussian Process (DMGP) model for regression using batch-optimization
data: a tuple of two tensors. First tuple corresponds to the design matrix with shape [N, D]
and the second tensor is an N-dimensional array with the targets
m: number of eigenfunctions used for the approximation; an integer greater than 1
d: number of outputs for the neural net; a positive integer
alpha: normalization constant related to the variance of the embedding space
eps_sq: initial value of the epsilon square
sigma_n_sq: initial value of the noise variance
sigma_f_sq: initial value of the signal variance
dir_weights: optional path to pre-trained weights for the 4-layer DNN; if None the network starts untrained
'''
def __init__(self,
data: Tuple[tf.Tensor, tf.Tensor],
m: int = 20,
d: int = 1,
alpha: float = 1./np.sqrt(2.),
eps_sq: float = 1,
sigma_n_sq: float = 1,
sigma_f_sq: float = 1,
dir_weights: str = None):
if data[1].dtype == np.float64:
K_bd.set_floatx('float64')
else:
set_default_float(np.float32)
self.num_data = tf.cast(data[1].shape[0], default_float())
self.data = (tf.cast(data[0], default_float()), tf.cast(data[1], default_float()))
self.const = tf.cast(0.5*data[1].size*np.log(2*np.pi), default_float())
self.flag_1d = d == 1
self.alpha = tf.cast(alpha, default_float())
self.alpha_sq = tf.square(self.alpha)
self.m = tf.cast(m, default_float())
self.this_range = tf.constant(np.asarray(list(product(range(1, m + 1), repeat=d))).squeeze(), dtype=default_float())
self.this_range_1 = self.this_range - 1.
self.this_range_1_2 = self.this_range_1 if self.flag_1d else tf.range(m, dtype=default_float())
self.this_range_1_int = tf.cast(self.this_range_1, tf.int32)
self.tf_range_dnn_out = tf.range(d)
self.this_range_1_ln2 = np.log(2.)*self.this_range_1
self.vander_range = tf.range(m+1, dtype=default_float())
self.eye_k = tf.eye(m**d, dtype=default_float())
self.yTy = tf.reduce_sum(tf.math.square(self.data[1]))
self.coeff_n_tf = tf.constant(np.load(os.path.dirname(os.path.realpath(__file__)) + '/hermite_coeff.npy')[:m, :m], dtype=default_float())
eps_sq = eps_sq*np.ones(d) if d > 1 else eps_sq
self.eps_sq = Parameter(eps_sq, transform=positive(), dtype=default_float())
self.sigma_f_sq = Parameter(sigma_f_sq, transform=positive(), dtype=default_float())
self.sigma_n_sq = Parameter(sigma_n_sq, transform=positive(), dtype=default_float())
model = models.Sequential()
model.add(layers.Dense(512, activation='tanh', input_dim=data[0].shape[1]))
model.add(layers.Dense(256, activation='tanh'))
model.add(layers.Dense(64, activation='tanh'))
model.add(layers.Dense(d))
if dir_weights is not None:
model.load_weights(dir_weights)
self.neural_net = model
def neg_log_marginal_likelihood(self, data_in: Tuple[tf.Tensor, tf.Tensor] = None) -> tf.Tensor:
'''
It computes the negative log marginal likelihood up to a constant using either batches or the full training dataset
'''
if data_in is None:
Xb = self.data[0]
yb = self.data[1]
yTy_b = self.yTy
else:
Xb = data_in[0]
yb = data_in[1]
yTy_b = tf.reduce_sum(tf.math.square(data_in[1]))
x_nn_tr = tf.squeeze(self.neural_net(Xb))
x_nn_tr = (x_nn_tr - tf.math.reduce_mean(x_nn_tr,0))/tf.math.reduce_std(x_nn_tr,0)
inv_sigma_sq = 1/self.sigma_n_sq
Lambda_Herm, V_herm = self.eigen_fun(x_nn_tr) # [k, None], [N, k]
V_lambda_sqrt = V_herm*tf.math.sqrt(Lambda_Herm) # [N, k]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, yb, transpose_a=True) # [k, None]
low_rank_term = self.eye_k + inv_sigma_sq*tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [k, k]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [k, 1]
data_fit = inv_sigma_sq*(yTy_b - inv_sigma_sq*tf.reduce_sum(tf.math.square(L_inv_V_y)))
return 0.5*(data_fit + self.num_data*tf.math.log(self.sigma_n_sq)) + tf.reduce_sum(tf.math.log(tf.linalg.diag_part(low_rank_L))) + self.const
def eigen_fun(self, data_x) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the eigenvalues and eigenfunctions evaluated at the training inputs.
'''
beta_tmp = 1. + 4.*self.eps_sq/self.alpha_sq # [d, None]
beta = tf.pow(beta_tmp, .25) # [d, None]
beta_sq = tf.math.sqrt(beta_tmp) # [d, None]
delta_sq = .5*self.alpha_sq*(tf.square(beta) - 1.) # [d, None]
log_a_d_ep_sq = tf.math.log(self.alpha_sq + delta_sq + self.eps_sq) # [d, None]
log_term = .5*(tf.math.log(self.alpha_sq) - log_a_d_ep_sq) + self.this_range_1*(tf.math.log(self.eps_sq) - log_a_d_ep_sq) # [n_eigen^d, d]
gamma = tf.exp(.5 * (tf.math.log(beta) - self.this_range_1_ln2 - tf.math.lgamma(self.this_range))) # [n_eigen^d, d]
tmp_exp = tf.exp(-delta_sq*(tf.square(data_x))) # [N, d]
x_data_upgr = self.alpha*beta*data_x # [N, d]
vander_tf = tf.math.pow(tf.expand_dims(x_data_upgr, axis=-1), self.this_range_1_2[None, :]) # [N, n_eigen] or [N, d, n_eigen]
final_mtr = tf.linalg.matmul(vander_tf, self.coeff_n_tf, transpose_b=True) # [N, n_eigen] or [N, d, n_eigen]
if self.flag_1d:
tf_lambda_n = tf.exp(log_term)
tf_phi_n = gamma*tmp_exp[:, None]*final_mtr # [N, n_eigen]
else:
tf_lambda_n = tf.exp(tf.reduce_sum(log_term, 1))
eigen_fun_tmp = tf.map_fn(lambda x: tf.gather(final_mtr[:, x], self.this_range_1_int[:, x], axis=-1), self.tf_range_dnn_out, dtype=default_float()) # [d, N, n_eigen^d]; dtype must follow default_float(), not a hardcoded tf.float64
eigen_fun_tmp = tf.einsum('ed,nd,dne->dne', gamma, tmp_exp, eigen_fun_tmp) # [d, N, n_eigen^d]
tf_phi_n = tf.reduce_prod(eigen_fun_tmp, 0) # [N, n_eigen^d]
return self.sigma_f_sq*tf_lambda_n, tf_phi_n
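The `eigen_fun` method implements the Fasshauer-style Mercer eigen-decomposition of the squared-exponential kernel. A minimal 1-D NumPy sketch (my own restatement, omitting the `sigma_f_sq` scaling and replacing the precomputed `hermite_coeff.npy` table with `numpy.polynomial.hermite`) checks that the truncated expansion reproduces `exp(-eps_sq*(x - z)**2)`:

```python
import math
import numpy as np
from numpy.polynomial.hermite import hermval

# Eigenpairs of k(x, z) = exp(-eps_sq*(x - z)**2) w.r.t. a Gaussian
# measure with scale alpha; same formulas as eigen_fun above (1-D case).
alpha_sq = 0.5          # alpha = 1/sqrt(2), the model default
eps_sq = 1.0
m = 40                  # truncation level

beta = (1.0 + 4.0 * eps_sq / alpha_sq) ** 0.25
delta_sq = 0.5 * alpha_sq * (beta**2 - 1.0)
denom = alpha_sq + delta_sq + eps_sq
n = np.arange(m)                                    # Hermite orders 0..m-1
lam = np.sqrt(alpha_sq / denom) * (eps_sq / denom) ** n
fact = np.array([math.factorial(int(i)) for i in n], dtype=float)
gamma = np.sqrt(beta / (2.0**n * fact))

def phi(x):
    # phi_n(x) = gamma_n * exp(-delta_sq*x^2) * H_n(alpha*beta*x)
    xs = np.sqrt(alpha_sq) * beta * x
    H = np.array([hermval(xs, np.eye(m)[i]) for i in range(m)])
    return gamma * np.exp(-delta_sq * x**2) * H

x, z = 0.3, -0.2
approx = np.sum(lam * phi(x) * phi(z))
exact = np.exp(-eps_sq * (x - z) ** 2)
assert abs(approx - exact) < 1e-8                   # Mercer series converges
```

With these parameters the eigenvalue ratio `eps_sq/denom` is 0.5, so the series converges geometrically and 40 terms already match the kernel to well below 1e-8.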
def predict_y(self, x_test: tf.Tensor, full_cov: bool = False) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the mean and variance of the held-out data x_test.
:x_test: tf.Tensor
Input locations of the held-out data with shape=[Ntest, D]
where Ntest is the number of rows and D is the input dimension of each point.
:full_cov: bool
If True, compute and return the full Ntest x Ntest test covariance matrix. Otherwise return only the diagonal of this matrix.
'''
inv_sigma_sq = 1/self.sigma_n_sq
x_nn_tr = tf.squeeze(self.neural_net(self.data[0]))
mean_x_nn_tr, std_x_nn_tr = tf.math.reduce_mean(x_nn_tr, 0), tf.math.reduce_std(x_nn_tr, 0)
x_nn_tr = (x_nn_tr - mean_x_nn_tr)/std_x_nn_tr
x_nn_test = tf.squeeze(self.neural_net(x_test))
x_nn_test = (x_nn_test - mean_x_nn_tr)/std_x_nn_tr
Lambda_Herm, V_herm = self.eigen_fun(x_nn_tr)
V_herm_test = self.eigen_fun(x_nn_test)[1]
sqrt_Lambda_Herm = tf.math.sqrt(Lambda_Herm)
V_lambda_sqrt = V_herm*sqrt_Lambda_Herm
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True)
V_test_lambda_sqrt = V_herm_test*sqrt_Lambda_Herm
K_Xtest_X_y = tf.linalg.matvec(V_test_lambda_sqrt, V_lambda_sqrt_y)
VT_V = tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True)
low_rank_term = self.eye_k + inv_sigma_sq*VT_V
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True)
V_K_Xtest = tf.linalg.matmul(VT_V, V_test_lambda_sqrt, transpose_b=True)
tmp_inv = tf.linalg.triangular_solve(low_rank_L, V_K_Xtest, lower=True)
mean_f = inv_sigma_sq*(K_Xtest_X_y - inv_sigma_sq*tf.linalg.matvec(tmp_inv, tf.squeeze(L_inv_V_y), transpose_a=True))
if full_cov:
tmp_matmul = tf.linalg.matmul(V_test_lambda_sqrt, V_K_Xtest)
K_Xtest_herm = tf.linalg.matmul(V_test_lambda_sqrt, V_test_lambda_sqrt, transpose_b=True)
var_f = K_Xtest_herm - inv_sigma_sq*(tmp_matmul - inv_sigma_sq*tf.linalg.matmul(tmp_inv, tmp_inv, transpose_a=True))
diag = tf.linalg.diag_part(var_f) + self.sigma_n_sq
var_f = tf.linalg.set_diag(var_f, diag)
else:
var_f = self.sigma_f_sq + self.sigma_n_sq - inv_sigma_sq*(tf.einsum('kn,nk->n', V_K_Xtest, V_test_lambda_sqrt) - inv_sigma_sq*tf.reduce_sum(tf.math.square(tmp_inv), 0))
return mean_f, var_f
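All four models share the same low-rank likelihood algebra: with features `V` already scaled by `sqrt(Lambda)`, the Woodbury identity and the matrix determinant lemma reduce the N x N solve and log-determinant to a k x k Cholesky. A NumPy sketch (my own restatement of the TF code above, not part of the original source) confirming the two routes agree:

```python
import numpy as np

rng = np.random.default_rng(0)
N, k, sigma_sq = 50, 5, 0.3
V = rng.standard_normal((N, k))   # stands in for eigenfunctions * sqrt(Lambda)
y = rng.standard_normal(N)

# Dense reference: NLL of y ~ N(0, V V^T + sigma_sq I)
K = V @ V.T + sigma_sq * np.eye(N)
nll_dense = 0.5 * (y @ np.linalg.solve(K, y)
                   + np.linalg.slogdet(K)[1]
                   + N * np.log(2 * np.pi))

# Low-rank route (mirrors neg_log_marginal_likelihood):
# B = I_k + sigma^-2 V^T V, L = chol(B)
B = np.eye(k) + V.T @ V / sigma_sq
L = np.linalg.cholesky(B)
Vy = V.T @ y
a = np.linalg.solve(L, Vy)                       # L^-1 V^T y
data_fit = (y @ y - (a @ a) / sigma_sq) / sigma_sq
log_det = N * np.log(sigma_sq) + 2 * np.sum(np.log(np.diag(L)))
nll_lowrank = 0.5 * (data_fit + log_det + N * np.log(2 * np.pi))

assert np.isclose(nll_dense, nll_lowrank)
```

The identities are exact, so the two values agree to machine precision; the payoff is that the low-rank route costs O(N k^2) instead of O(N^3).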
class DFGP_model(Module):
def __init__(self,
data: Tuple[tf.Tensor, tf.Tensor],
m: int = 100,
d: int = 4,
lengthscales = None,
sigma_n_sq: float = 1,
sigma_f_sq: float = 1,
dir_weights: str = None):
if data[1].dtype == np.float64:
K_bd.set_floatx('float64')
else:
set_default_float(np.float32)
self.num_data = tf.cast(data[1].shape[0], default_float())
self.data = (tf.cast(data[0], default_float()), tf.cast(data[1], default_float()))
self.const = tf.cast(0.5*data[1].size*np.log(2*np.pi), default_float())
self.eye_2m = tf.eye(2*m, dtype=default_float())
self.yTy = tf.reduce_sum(tf.math.square(self.data[1]))
self.m_float = tf.cast(m, default_float())
self.randn = tf.random.normal(shape=[m, d], dtype=default_float())
lengthscales0 = np.ones(d) if lengthscales is None else lengthscales
self.lengthscales = Parameter(lengthscales0, transform=positive(), dtype=default_float())
self.sigma_f_sq = Parameter(sigma_f_sq, transform=positive(), dtype=default_float())
self.sigma_n_sq = Parameter(sigma_n_sq, transform=positive(), dtype=default_float())
model = models.Sequential()
model.add(layers.Dense(512, activation='tanh', input_dim=data[0].shape[1]))
model.add(layers.Dense(256, activation='tanh'))
model.add(layers.Dense(64, activation='tanh'))
model.add(layers.Dense(d))
if dir_weights is not None:
model.load_weights(dir_weights)
self.neural_net = model
def neg_log_marginal_likelihood(self, data_in: Tuple[tf.Tensor, tf.Tensor] = None) -> tf.Tensor:
'''
It computes the negative log marginal likelihood up to a constant using either batches or the full training dataset
'''
if data_in is None:
Xb = self.data[0]
yb = self.data[1]
yTy_b = self.yTy
else:
Xb = data_in[0]
yb = data_in[1]
yTy_b = tf.reduce_sum(tf.math.square(data_in[1]))
x_nn_tr = self.neural_net(Xb) # [batch, d]
inv_sigma_sq = 1/self.sigma_n_sq
V_lambda_sqrt = self.fourier_features(x_nn_tr) # [batch, 2m]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, yb, transpose_a=True) # [k, None]
low_rank_term = self.eye_2m + inv_sigma_sq*tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [k, k]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [k, 1]
data_fit = inv_sigma_sq*(yTy_b - inv_sigma_sq*tf.reduce_sum(tf.math.square(L_inv_V_y)))
return 0.5*(data_fit + self.num_data*tf.math.log(self.sigma_n_sq)) + tf.reduce_sum(tf.math.log(tf.linalg.diag_part(low_rank_L))) + self.const
def fourier_features(self, data_x) -> tf.Tensor:
'''
Computing the random Fourier expansion
'''
freq = self.randn / self.lengthscales
xall_freq = tf.linalg.matmul(data_x, freq, transpose_b=True) # [batch, m]
cos_freq = tf.math.cos(xall_freq)
sin_freq = tf.math.sin(xall_freq)
full_z = tf.concat([cos_freq, sin_freq], 1) # [batch, 2m]
return tf.math.sqrt(self.sigma_f_sq/self.m_float)*full_z
def predict_y(self, x_test: tf.Tensor, full_cov: bool = False) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the mean and variance of the held-out data x_test.
:x_test: tf.Tensor
Input locations of the held-out data with shape=[Ntest, D]
where Ntest is the number of rows and D is the input dimension of each point.
:full_cov: bool
If True, compute and return the full Ntest x Ntest test covariance matrix. Otherwise return only the diagonal of this matrix.
'''
inv_sigma_sq = 1/self.sigma_n_sq
x_nn_tr = self.neural_net(self.data[0])
x_nn_test = self.neural_net(x_test)
V_lambda_sqrt = self.fourier_features(x_nn_tr) # [N, 2m]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True) # [2m, None]
V_test_lambda_sqrt = self.fourier_features(x_nn_test) # [Ntest, 2m]
K_Xtest_X_y = tf.linalg.matvec(V_test_lambda_sqrt, V_lambda_sqrt_y) # [Ntest, None]
VT_V = tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [2m, 2m]
low_rank_term = self.eye_2m + inv_sigma_sq*VT_V # [2m, 2m]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [2m, 1]
V_K_Xtest = tf.linalg.matmul(VT_V, V_test_lambda_sqrt, transpose_b=True) # [2m, N_test]
tmp_inv = tf.linalg.triangular_solve(low_rank_L, V_K_Xtest, lower=True) # [2m, N_test]
mean_f = inv_sigma_sq*(K_Xtest_X_y - inv_sigma_sq*tf.linalg.matvec(tmp_inv, tf.squeeze(L_inv_V_y), transpose_a=True))
if full_cov:
tmp_matmul = tf.linalg.matmul(V_test_lambda_sqrt, V_K_Xtest) # [Ntest, Ntest]
K_Xtest_herm = tf.linalg.matmul(V_test_lambda_sqrt, V_test_lambda_sqrt, transpose_b=True) # [Ntest, Ntest]
var_f = K_Xtest_herm - inv_sigma_sq*(tmp_matmul - inv_sigma_sq*tf.linalg.matmul(tmp_inv, tmp_inv, transpose_a=True))
diag = tf.linalg.diag_part(var_f) + self.sigma_n_sq
var_f = tf.linalg.set_diag(var_f, diag)
else:
var_f = self.sigma_f_sq + self.sigma_n_sq - inv_sigma_sq*(tf.einsum('kn,nk->n', V_K_Xtest, V_test_lambda_sqrt) - inv_sigma_sq*tf.reduce_sum(tf.math.square(tmp_inv), 0))
return mean_f, var_f
class MGP_model(Module):
def __init__(self,
data: Tuple[tf.Tensor, tf.Tensor],
m: int = 20,
alpha: float = 1./np.sqrt(2.),
eps_sq: float = 1,
sigma_n_sq: float = 1,
sigma_f_sq: float = 1):
self.num_data = tf.cast(data[1].shape[0], default_float())
self.data = (tf.cast(tf.squeeze(data[0]), default_float()), tf.cast(data[1], default_float()))
self.const = tf.cast(0.5*data[1].size*np.log(2*np.pi), default_float())
D = data[0].shape[1]
self.flag_1d = D == 1
self.alpha = tf.cast(alpha, default_float())
self.alpha_sq = tf.square(self.alpha)
self.m = tf.cast(m, default_float())
self.this_range = tf.constant(np.asarray(list(product(range(1, m + 1), repeat=D))).squeeze(), dtype=default_float())
self.this_range_1 = self.this_range - 1.
self.this_range_1_2 = self.this_range_1 if self.flag_1d else tf.range(m, dtype=default_float())
self.this_range_1_int = tf.cast(self.this_range_1, tf.int32)
self.tf_range_dnn_out = tf.range(D)
self.this_range_1_ln2 = np.log(2.)*self.this_range_1
self.vander_range = tf.range(m+1, dtype=default_float())
self.eye_k = tf.eye(m**D, dtype=default_float())
self.yTy = tf.reduce_sum(tf.math.square(self.data[1]))
self.coeff_n_tf = tf.constant(np.load(os.path.dirname(os.path.realpath(__file__)) + '/hermite_coeff.npy')[:m, :m], dtype=default_float())
eps_sq = eps_sq*np.ones(D) if D > 1 else eps_sq
self.eps_sq = Parameter(eps_sq, transform=positive(), dtype=default_float())
self.sigma_f_sq = Parameter(sigma_f_sq, transform=positive(), dtype=default_float())
self.sigma_n_sq = Parameter(sigma_n_sq, transform=positive(), dtype=default_float())
def neg_log_marginal_likelihood(self) -> tf.Tensor:
'''
It computes the negative log marginal likelihood up to a constant
'''
inv_sigma_sq = 1/self.sigma_n_sq
Lambda_Herm, V_herm = self.eigen_fun(self.data[0]) # [k, None], [N, k]
V_lambda_sqrt = V_herm*tf.math.sqrt(Lambda_Herm) # [N, k]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True) # [k, None]
low_rank_term = self.eye_k + inv_sigma_sq*tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [k, k]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [k, 1]
data_fit = inv_sigma_sq*(self.yTy - inv_sigma_sq*tf.reduce_sum(tf.math.square(L_inv_V_y)))
return 0.5*(data_fit + self.num_data*tf.math.log(self.sigma_n_sq)) + tf.reduce_sum(tf.math.log(tf.linalg.diag_part(low_rank_L))) + self.const
def eigen_fun(self, data_x) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the eigenvalues and eigenfunctions evaluated at the training inputs.
'''
beta_tmp = 1. + 4.*self.eps_sq/self.alpha_sq # [d, None]
beta = tf.pow(beta_tmp, .25) # [d, None]
beta_sq = tf.math.sqrt(beta_tmp) # [d, None]
delta_sq = .5*self.alpha_sq*(tf.square(beta) - 1.) # [d, None]
log_a_d_ep_sq = tf.math.log(self.alpha_sq + delta_sq + self.eps_sq) # [d, None]
log_term = .5*(tf.math.log(self.alpha_sq) - log_a_d_ep_sq) + self.this_range_1*(tf.math.log(self.eps_sq) - log_a_d_ep_sq) # [n_eigen^d, d]
gamma = tf.exp(.5 * (tf.math.log(beta) - self.this_range_1_ln2 - tf.math.lgamma(self.this_range))) # [n_eigen^d, d]
tmp_exp = tf.exp(-delta_sq*(tf.square(data_x))) # [N, d]
x_data_upgr = self.alpha*beta*data_x # [N, d]
vander_tf = tf.math.pow(tf.expand_dims(x_data_upgr, axis=-1), self.this_range_1_2[None, :]) # [N, n_eigen] or [N, d, n_eigen]
final_mtr = tf.linalg.matmul(vander_tf, self.coeff_n_tf, transpose_b=True) # [N, n_eigen] or [N, d, n_eigen]
if self.flag_1d:
tf_lambda_n = tf.exp(log_term)
tf_phi_n = gamma*tmp_exp[:, None]*final_mtr # [N, n_eigen]
else:
tf_lambda_n = tf.exp(tf.reduce_sum(log_term, 1))
eigen_fun_tmp = tf.map_fn(lambda x: tf.gather(final_mtr[:, x], self.this_range_1_int[:, x], axis=-1), self.tf_range_dnn_out, dtype=default_float()) # [d, N, n_eigen^d]; dtype must follow default_float(), not a hardcoded tf.float64
eigen_fun_tmp = tf.einsum('ed,nd,dne->dne', gamma, tmp_exp, eigen_fun_tmp) # [d, N, n_eigen^d]
tf_phi_n = tf.reduce_prod(eigen_fun_tmp, 0) # [N, n_eigen^d]
return self.sigma_f_sq*tf_lambda_n, tf_phi_n
def predict_y(self, x_test: tf.Tensor, full_cov: bool = False) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the mean and variance of the held-out data x_test.
:x_test: tf.Tensor
Input locations of the held-out data with shape=[Ntest, D]
where Ntest is the number of rows and D is the input dimension of each point.
:full_cov: bool
If True, compute and return the full Ntest x Ntest test covariance matrix. Otherwise return only the diagonal of this matrix.
'''
inv_sigma_sq = 1/self.sigma_n_sq
Lambda_Herm, V_herm = self.eigen_fun(self.data[0]) # [N, k]
V_herm_test = self.eigen_fun(tf.squeeze(x_test))[1] # [Ntest, k]
sqrt_Lambda_Herm = tf.math.sqrt(Lambda_Herm) # [k, None]
V_lambda_sqrt = V_herm*sqrt_Lambda_Herm # [N, k]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True) # [k, None]
V_test_lambda_sqrt = V_herm_test*sqrt_Lambda_Herm # [Ntest, k]
K_Xtest_X_y = tf.linalg.matvec(V_test_lambda_sqrt, V_lambda_sqrt_y) # [Ntest, None]
VT_V = tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [k, k]
low_rank_term = self.eye_k + inv_sigma_sq*VT_V # [k, k]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [k, 1]
V_K_Xtest = tf.linalg.matmul(VT_V, V_test_lambda_sqrt, transpose_b=True) # [k, N_test]
tmp_inv = tf.linalg.triangular_solve(low_rank_L, V_K_Xtest, lower=True) # [k, N_test]
mean_f = inv_sigma_sq*(K_Xtest_X_y - inv_sigma_sq*tf.linalg.matvec(tmp_inv, tf.squeeze(L_inv_V_y), transpose_a=True))
if full_cov:
tmp_matmul = tf.linalg.matmul(V_test_lambda_sqrt, V_K_Xtest) # [Ntest, Ntest]
K_Xtest_herm = tf.linalg.matmul(V_test_lambda_sqrt, V_test_lambda_sqrt, transpose_b=True) # [Ntest, Ntest]
var_f = K_Xtest_herm - inv_sigma_sq*(tmp_matmul - inv_sigma_sq*tf.linalg.matmul(tmp_inv, tmp_inv, transpose_a=True))
diag = tf.linalg.diag_part(var_f) + self.sigma_n_sq
var_f = tf.linalg.set_diag(var_f, diag)
else:
var_f = self.sigma_f_sq + self.sigma_n_sq - inv_sigma_sq*(tf.einsum('kn,nk->n', V_K_Xtest, V_test_lambda_sqrt) - inv_sigma_sq*tf.reduce_sum(tf.math.square(tmp_inv), 0))
return mean_f, var_f
class FGP_model(Module):
def __init__(self,
data: Tuple[tf.Tensor, tf.Tensor],
m: int = 100,
lengthscales = None,
sigma_n_sq: float = 1,
sigma_f_sq: float = 1,
randn = None):
self.num_data = tf.cast(data[1].size, default_float())
self.data = (tf.cast(data[0], default_float()), tf.cast(data[1], default_float()))
self.const = tf.cast(0.5*data[1].size*np.log(2*np.pi), default_float())
self.eye_2m = tf.eye(2*m, dtype=default_float())
self.yTy = tf.reduce_sum(tf.math.square(self.data[1]))
self.m_float = tf.cast(m, default_float())
self.randn = tf.random.normal(shape=[m, data[0].shape[1]], dtype=default_float()) if randn is None else tf.cast(randn[:, None], default_float())
lengthscales0 = np.ones(data[0].shape[1]) if lengthscales is None else lengthscales
self.lengthscales = Parameter(lengthscales0, transform=positive(), dtype=default_float())
self.sigma_f_sq = Parameter(sigma_f_sq, transform=positive(), dtype=default_float())
self.sigma_n_sq = Parameter(sigma_n_sq, transform=positive(), dtype=default_float())
def neg_log_marginal_likelihood(self) -> tf.Tensor:
'''
It computes the negative log marginal likelihood up to a constant
'''
inv_sigma_sq = 1/self.sigma_n_sq
V_lambda_sqrt = self.fourier_features(self.data[0]) # [batch, 2m]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True) # [k, None]
low_rank_term = self.eye_2m + inv_sigma_sq*tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [k, k]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [k, 1]
data_fit = inv_sigma_sq*(self.yTy - inv_sigma_sq*tf.reduce_sum(tf.math.square(L_inv_V_y)))
return 0.5*(data_fit + self.num_data*tf.math.log(self.sigma_n_sq)) + tf.reduce_sum(tf.math.log(tf.linalg.diag_part(low_rank_L))) + self.const
def fourier_features(self, data_x) -> tf.Tensor:
freq = self.randn / self.lengthscales
xall_freq = tf.linalg.matmul(data_x, freq, transpose_b=True) # [batch, m]
cos_freq = tf.math.cos(xall_freq)
sin_freq = tf.math.sin(xall_freq)
full_z = tf.concat([cos_freq, sin_freq], 1) # [batch, 2m]
return tf.math.sqrt(self.sigma_f_sq/self.m_float)*full_z
def predict_y(self, x_test: tf.Tensor, full_cov: bool = False) -> Tuple[tf.Tensor, tf.Tensor]:
'''
It computes the mean and variance of the held-out data x_test.
:x_test: tf.Tensor
Input locations of the held-out data with shape=[Ntest, D]
where Ntest is the number of rows and D is the input dimension of each point.
:full_cov: bool
If True, compute and return the full Ntest x Ntest test covariance matrix. Otherwise return only the diagonal of this matrix.
'''
inv_sigma_sq = 1/self.sigma_n_sq
V_lambda_sqrt = self.fourier_features(self.data[0]) # [N, 2m]
V_lambda_sqrt_y = tf.linalg.matvec(V_lambda_sqrt, self.data[1], transpose_a=True) # [2m, None]
V_test_lambda_sqrt = self.fourier_features(x_test) # [Ntest, 2m]
K_Xtest_X_y = tf.linalg.matvec(V_test_lambda_sqrt, V_lambda_sqrt_y) # [Ntest, None]
VT_V = tf.linalg.matmul(V_lambda_sqrt, V_lambda_sqrt, transpose_a=True) # [2m, 2m]
low_rank_term = self.eye_2m + inv_sigma_sq*VT_V # [2m, 2m]
low_rank_L = tf.linalg.cholesky(low_rank_term)
L_inv_V_y = tf.linalg.triangular_solve(low_rank_L, V_lambda_sqrt_y[:, None], lower=True) # [2m, 1]
V_K_Xtest = tf.linalg.matmul(VT_V, V_test_lambda_sqrt, transpose_b=True) # [2m, N_test]
tmp_inv = tf.linalg.triangular_solve(low_rank_L, V_K_Xtest, lower=True) # [2m, N_test]
mean_f = inv_sigma_sq*(K_Xtest_X_y - inv_sigma_sq*tf.linalg.matvec(tmp_inv, tf.squeeze(L_inv_V_y), transpose_a=True))
if full_cov:
tmp_matmul = tf.linalg.matmul(V_test_lambda_sqrt, V_K_Xtest) # [Ntest, Ntest]
K_Xtest_herm = tf.linalg.matmul(V_test_lambda_sqrt, V_test_lambda_sqrt, transpose_b=True) # [Ntest, Ntest]
var_f = K_Xtest_herm - inv_sigma_sq*(tmp_matmul - inv_sigma_sq*tf.linalg.matmul(tmp_inv, tmp_inv, transpose_a=True))
diag = tf.linalg.diag_part(var_f) + self.sigma_n_sq
var_f = tf.linalg.set_diag(var_f, diag)
else:
var_f = self.sigma_f_sq + self.sigma_n_sq - inv_sigma_sq*(tf.einsum('kn,nk->n', V_K_Xtest, V_test_lambda_sqrt) - inv_sigma_sq*tf.reduce_sum(tf.math.square(tmp_inv), 0))
return mean_f, var_f
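`fourier_features` in `DFGP_model` and `FGP_model` is the standard random Fourier feature map of Rahimi and Recht for the ARD squared-exponential kernel. A small NumPy sketch (my own, with made-up lengthscales) checking that `phi(x) @ phi(z).T` approximates `sigma_f_sq * exp(-0.5 * sum(((x - z)/lengthscales)**2))`:

```python
import numpy as np

rng = np.random.default_rng(1)
m, d = 20000, 2
lengthscales = np.array([1.0, 0.7])
sigma_f_sq = 1.3

freq = rng.standard_normal((m, d)) / lengthscales  # mirrors self.randn / self.lengthscales

def fourier_features(x):
    # x: [n, d] -> [n, 2m] features whose inner products approximate the kernel
    xf = x @ freq.T
    z = np.concatenate([np.cos(xf), np.sin(xf)], axis=1)
    return np.sqrt(sigma_f_sq / m) * z

x = np.array([[0.0, 0.0], [0.5, 0.3]])
phi = fourier_features(x)
approx = (phi @ phi.T)[0, 1]

diff = (x[0] - x[1]) / lengthscales
exact = sigma_f_sq * np.exp(-0.5 * diff @ diff)
assert abs(approx - exact) < 0.1       # Monte Carlo error is O(1/sqrt(m))
```

The approximation error shrinks as 1/sqrt(m), which is why the models treat `m` (number of spectral frequencies) as a capacity knob.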
# test/integration/component/maint/test_dedicate_guest_vlan_ranges.py (repo: ycyun/ablestack-cloud, Apache-2.0 license)
# Licensed to the Apache Software Foundation (ASF) under one
] | 1,083 | 2015-01-05T01:16:52.000Z | 2022-03-31T12:14:10.000Z | # Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
""" P1 tests for Dedicating guest VLAN ranges
Test Plan: https://cwiki.apache.org/confluence/display/CLOUDSTACK/Dedicated+Resources+-+Public+IP+Addresses+and+VLANs+per+Tenant+Test+Plan
Issue Link: https://issues.apache.org/jira/browse/CLOUDSTACK-2251
Feature Specifications: https://cwiki.apache.org/confluence/display/CLOUDSTACK/FS-+Dedicate+Guest+VLANs+per+tenant
"""
#Import Local Modules
from nose.plugins.attrib import attr
from marvin.cloudstackTestCase import cloudstackTestCase
import unittest
from marvin.lib.utils import (validateList,
cleanup_resources,
random_gen,
xsplit)
from marvin.lib.base import (Account,
Domain,
PhysicalNetwork,
NetworkOffering,
Network,
ServiceOffering,
Project)
from marvin.lib.common import (get_domain,
get_zone,
get_template,
setNonContiguousVlanIds,
isNetworkDeleted)
from marvin.codes import PASS
def LimitVlanRange(self, vlanrange, length=2):
"""Limit a vlan range such as "100-200" to `length` consecutive ids,
anchored at the upper endpoint of the input range"""
vlan_endpoints = str(vlanrange).split("-")
vlan_startid = int(vlan_endpoints[1])
vlan_endid = vlan_startid + (length - 1)
return str(vlan_startid) + "-" + str(vlan_endid)
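Note that `LimitVlanRange` anchors the shortened range at the *upper* endpoint of the input (`vlan_endpoints[1]`), not the lower one. A standalone sketch of the same logic (hypothetical name, without the unused `self` argument) makes the behaviour concrete:

```python
def limit_vlan_range(vlanrange, length=2):
    # Same logic as LimitVlanRange above: start at the upper endpoint
    # of the given range and extend for `length` consecutive vlan ids.
    start, end = str(vlanrange).split("-")
    new_start = int(end)
    return "%d-%d" % (new_start, new_start + length - 1)

print(limit_vlan_range("100-200"))      # -> 200-201
```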
class TestDedicateGuestVLANRange(cloudstackTestCase):
@classmethod
def setUpClass(cls):
cls.testClient = super(TestDedicateGuestVLANRange, cls).getClsTestClient()
cls.apiclient = cls.testClient.getApiClient()
cls.testdata = cls.testClient.getParsedTestDataConfig()
# Get Zone, Domain
cls.domain = get_domain(cls.apiclient)
cls.zone = get_zone(cls.apiclient)
cls.testdata["isolated_network"]["zoneid"] = cls.zone.id
cls.testdata['mode'] = cls.zone.networktype
template = get_template(
cls.apiclient,
cls.zone.id,
cls.testdata["ostype"]
)
cls._cleanup = []
try:
cls.isolated_network_offering = NetworkOffering.create(
cls.apiclient,
cls.testdata["nw_off_isolated_persistent"])
cls._cleanup.append(cls.isolated_network_offering)
cls.isolated_network_offering.update(cls.apiclient, state='Enabled')
cls.testdata["nw_off_isolated_persistent"]["specifyVlan"] = True
cls.isolated_network_offering_vlan = NetworkOffering.create(
cls.apiclient,
cls.testdata["nw_off_isolated_persistent"])
cls._cleanup.append(cls.isolated_network_offering_vlan)
cls.isolated_network_offering_vlan.update(cls.apiclient, state='Enabled')
cls.service_offering = ServiceOffering.create(
cls.apiclient,
cls.testdata["service_offering"])
cls._cleanup.append(cls.service_offering)
cls.testdata["small"]["zoneid"] = cls.zone.id
cls.testdata["small"]["template"] = template.id
except Exception as e:
cls.tearDownClass()
raise unittest.SkipTest(e)
return
@classmethod
def tearDownClass(cls):
try:
# Cleanup resources used
cleanup_resources(cls.apiclient, cls._cleanup)
except Exception as e:
raise Exception("Warning: Exception during cleanup : %s" % e)
return
def setUp(self):
self.apiclient = self.testClient.getApiClient()
self.dbclient = self.testClient.getDbConnection()
self.cleanup = []
self.physical_network, self.free_vlan = setNonContiguousVlanIds(self.apiclient,
self.zone.id)
return
def tearDown(self):
try:
# Clean up
cleanup_resources(self.apiclient, self.cleanup)
except Exception as e:
raise Exception("Warning: Exception during cleanup : %s" % e)
finally:
self.physical_network.update(self.apiclient,
id=self.physical_network.id,
vlan=self.physical_network.vlan)
return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_01_dedicate_guest_vlan_range_root_domain(self):
        """Dedicate guest vlan range to an account in the root domain

        # Validate the following:
        # 1. Create two accounts under the root domain
        # 2. Dedicate a new vlan range to account 1
        # 3. Verify that the new vlan range is dedicated to account 1
        #    by listing the dedicated range and checking the account name
        # 4. Try to create a guest network in account 2 using a vlan from the dedicated range
        # 5. The operation should fail
        # 6. Create a guest network in account 1
        # 7. Verify that the vlan for the guest network is acquired from the dedicated range
        # 8. Delete the guest network
        # 9. Verify that the network is deleted
        # 10. Verify that the vlan range is still dedicated to account 1 after deleting the network
        # 11. Release the vlan range back to the system
        # 12. Verify that the list of dedicated vlans doesn't contain the range
        """
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account1)
        self.account2 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account2)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the guest vlan range to account 1
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to account 1"
        )
        dedicatedvlans = str(self.free_vlan["partial_range"][0]).split("-")
        # Creating a network in account 2 with a vlan from the dedicated range should fail
        with self.assertRaises(Exception):
            isolated_network1 = Network.create(
                self.apiclient,
                self.testdata["isolated_network"],
                self.account2.name,
                self.account2.domainid,
                networkofferingid=self.isolated_network_offering_vlan.id,
                vlan=int(dedicatedvlans[0]))
            isolated_network1.delete(self.apiclient)
        isolated_network2 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account1.name,
            self.account1.domainid,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network2.id)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(int(dedicatedvlans[0]) <= int(networks[0].vlan) <= int(dedicatedvlans[1]),
                        "Vlan of the network should be from the dedicated range")
        isolated_network2.delete(self.apiclient)
        self.assertTrue(isNetworkDeleted(self.apiclient, networkid=isolated_network2.id),
                        "Network not deleted within the timeout period")
        # List again after deleting all networks; the range should still be dedicated to the account
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to account 1"
        )
        self.debug("Releasing guest vlan range")
        dedicate_guest_vlan_range_response.release(self.apiclient)
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(self.apiclient)
        self.assertEqual(
            list_dedicated_guest_vlan_range_response,
            None,
            "The released vlan range should no longer appear in listDedicatedGuestVlanRanges"
        )
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_02_dedicate_guest_vlan_range_user_domain(self):
        """Dedicate guest vlan range to an account in a user domain

        # Validate the following:
        # 1. Create a user domain and two accounts under it
        # 2. Dedicate a new vlan range to account 1
        # 3. Verify that the new vlan range is dedicated to account 1
        #    by listing the dedicated range and checking the account name
        # 4. Try to create a guest network in account 2 using a vlan from the dedicated range
        # 5. The operation should fail
        # 6. Create a guest network in account 1
        # 7. Verify that the vlan for the guest network is acquired from the dedicated range
        # 8. Delete the guest network
        # 9. Verify that the network is deleted
        # 10. Release the vlan range back to the system
        # 11. Verify that the list of dedicated vlans doesn't contain the range
        """
        self.user_domain1 = Domain.create(
            self.apiclient,
            services=self.testdata["domain"],
            parentdomainid=self.domain.id)
        self.cleanup.append(self.user_domain1)
        # Create account 1
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.user_domain1.id
        )
        # Accounts must be cleaned up before their parent domain
        self.cleanup.insert(-1, self.account1)
        # Create account 2
        self.account2 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.user_domain1.id
        )
        self.cleanup.insert(-1, self.account2)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the guest vlan range to account 1
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to account 1"
        )
        dedicatedvlans = str(self.free_vlan["partial_range"][0]).split("-")
        # Creating a network in account 2 with a vlan from the dedicated range should fail
        with self.assertRaises(Exception):
            isolated_network1 = Network.create(
                self.apiclient,
                self.testdata["isolated_network"],
                self.account2.name,
                self.account2.domainid,
                networkofferingid=self.isolated_network_offering_vlan.id,
                vlan=int(dedicatedvlans[0]))
            isolated_network1.delete(self.apiclient)
        isolated_network2 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account1.name,
            self.account1.domainid,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network2.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(int(dedicatedvlans[0]) <= int(networks[0].vlan) <= int(dedicatedvlans[1]),
                        "Vlan of the network should be from the dedicated range")
        isolated_network2.delete(self.apiclient)
        self.assertTrue(isNetworkDeleted(self.apiclient, networkid=isolated_network2.id),
                        "Network not deleted within the timeout period")
        self.debug("Releasing guest vlan range")
        dedicate_guest_vlan_range_response.release(self.apiclient)
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(self.apiclient)
        self.assertEqual(
            list_dedicated_guest_vlan_range_response,
            None,
            "The released vlan range should no longer appear in listDedicatedGuestVlanRanges"
        )
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_03_multiple_guest_networks(self):
        """Create multiple guest networks in an account with a dedicated vlan range

        # Validate the following:
        # 1. Create an account under a user domain
        # 2. Dedicate a new vlan range of size 2 to the account
        # 3. Verify that the new vlan range is dedicated to the account
        #    by listing the dedicated range and checking the account name
        # 4. Create a guest network in the account
        # 5. Verify that the vlan of the network is from the dedicated range
        # 6. Repeat steps 4 and 5 for network 2
        # 7. Now create a 3rd guest network in the account
        # 8. Verify that the vlan of the network is not from the dedicated range, as
        #    all the vlans in the dedicated range are now exhausted
        """
        self.user_domain = Domain.create(
            self.apiclient,
            services=self.testdata["domain"],
            parentdomainid=self.domain.id)
        self.cleanup.append(self.user_domain)
        # Create account
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.user_domain.id
        )
        # The account must be cleaned up before its parent domain
        self.cleanup.insert(-1, self.account)
        # Limit the free range to exactly two vlan ids
        self.free_vlan["partial_range"][0] = LimitVlanRange(self, self.free_vlan["partial_range"][0], range=2)
        vlan_startid = int(str(self.free_vlan["partial_range"][0]).split("-")[0])
        vlan_endid = vlan_startid + 1
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the guest vlan range to the account
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to the account"
        )
        isolated_network1 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account.name,
            self.account.domainid,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network1.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                        "Vlan of the network should be from the dedicated range")
        isolated_network2 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account.name,
            self.account.domainid,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network2.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                        "Vlan of the network should be from the dedicated range")
        isolated_network3 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account.name,
            self.account.domainid,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network3.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertFalse(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                         "Vlan of the network should not be from the dedicated range")
        return
    @attr(tags=["invalid"])
    def test_04_dedicate_guest_vlan_in_project(self):
        """Dedicate a guest vlan range to the project owner account and test guest network vlans in the project

        # Validate the following:
        # 1. Create an account under a user domain
        # 2. Create a project with this account
        # 3. Dedicate a new vlan range to the account
        # 4. Verify that the new vlan range is dedicated to the account
        #    by listing the dedicated range and checking the account name
        # 5. Create a guest network in the project
        # 6. Verify that the vlan of the network is from the dedicated range
        # 7. Repeat steps 5 and 6 for network 2
        # 8. Now create a 3rd guest network in the project
        # 9. Verify that the vlan of the network is not from the dedicated range, as
        #    all the vlans in the dedicated range are now exhausted
        """
        user_domain = Domain.create(
            self.apiclient,
            services=self.testdata["domain"],
            parentdomainid=self.domain.id)
        self.cleanup.append(user_domain)
        # Create account
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=user_domain.id
        )
        # The account must be cleaned up before its parent domain
        self.cleanup.insert(-1, self.account)
        # Create a project as the domain admin
        project = Project.create(self.apiclient,
                                 self.testdata["project"],
                                 account=self.account.name,
                                 domainid=self.account.domainid)
        # The project must be cleaned up before its owner account
        self.cleanup.insert(-2, project)
        # Limit the free range to exactly two vlan ids
        self.free_vlan["partial_range"][0] = LimitVlanRange(self, self.free_vlan["partial_range"][0], range=2)
        vlan_startid = int(str(self.free_vlan["partial_range"][0]).split("-")[0])
        vlan_endid = vlan_startid + 1
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the guest vlan range to the account
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to the account"
        )
        isolated_network1 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            projectid=project.id,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network1.id, projectid=project.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                        "Vlan of the network should be from the dedicated range")
        isolated_network2 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            projectid=project.id,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network2.id, projectid=project.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                        "Vlan of the network should be from the dedicated range")
        isolated_network3 = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            projectid=project.id,
            networkofferingid=self.isolated_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network3.id, projectid=project.id, listall=True)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertFalse(vlan_startid <= int(networks[0].vlan) <= vlan_endid,
                         "Vlan of the network should not be from the dedicated range")
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_05_dedicate_range_different_accounts(self):
        """Dedicate two different vlan ranges to two different accounts

        # Validate the following:
        # 1. Create two accounts in the root domain
        # 2. Update the physical network with two different vlan ranges
        # 3. Dedicate the first vlan range to account 1
        # 4. Dedicate the second vlan range to account 2
        # 5. Both operations should be successful
        """
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account1)
        self.account2 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account2)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0] + "," + \
            self.free_vlan["partial_range"][1]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the first guest vlan range to account 1
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "listDedicatedGuestVlanRanges should report the first range as dedicated to account 1"
        )
        # Dedicate the second guest vlan range to account 2
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][1],
            physicalnetworkid=self.physical_network.id,
            account=self.account2.name,
            domainid=self.account2.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account2.name,
            "listDedicatedGuestVlanRanges should report the second range as dedicated to account 2"
        )
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_07_extend_vlan_range(self):
        """Dedicate two adjacent vlan ranges to the same account and verify they are merged

        # Validate the following:
        # 1. Create an account under the root domain
        # 2. Add a new vlan range to the physical network
        # 3. Dedicate the first part of the range to the account
        # 4. Dedicate the adjacent second part of the range to the same account
        # 5. Verify that the range listed as dedicated to the account is the
        #    full extended range
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        vlans = str(self.free_vlan["partial_range"][0]).split("-")
        startid = int(vlans[0])
        endid = int(vlans[1])
        vlan_range1 = str(startid) + "-" + str(endid)
        vlan_range2 = str(endid + 1) + "-" + str(endid + 2)
        full_range = str(startid) + "-" + str(endid + 2)
        new_vlan = self.physical_network.vlan + "," + full_range
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id, vlan=new_vlan)
        # Dedicate the first range
        PhysicalNetwork.dedicate(
            self.apiclient,
            vlan_range1,
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        # Dedicate the second, adjacent range
        PhysicalNetwork.dedicate(
            self.apiclient,
            vlan_range2,
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        dedicated_ranges = PhysicalNetwork.listDedicated(
            self.apiclient,
            account=self.account.name,
            domainid=self.account.domainid,
            listall=True
        )
        self.assertEqual(str(dedicated_ranges[0].guestvlanrange), full_range,
                         "Dedicated vlan range does not match the expected extended range")
        return
class TestFailureScenarios(cloudstackTestCase):

    @classmethod
    def setUpClass(cls):
        cls.testClient = super(TestFailureScenarios, cls).getClsTestClient()
        cls.apiclient = cls.testClient.getApiClient()
        cls.testdata = cls.testClient.getParsedTestDataConfig()
        # Get Zone, Domain
        cls.domain = get_domain(cls.apiclient)
        cls.zone = get_zone(cls.apiclient)
        cls.testdata["isolated_network"]["zoneid"] = cls.zone.id
        cls.testdata['mode'] = cls.zone.networktype
        template = get_template(
            cls.apiclient,
            cls.zone.id,
            cls.testdata["ostype"]
        )
        cls._cleanup = []
        try:
            cls.isolated_network_offering = NetworkOffering.create(
                cls.apiclient,
                cls.testdata["nw_off_isolated_persistent"])
            cls._cleanup.append(cls.isolated_network_offering)
            cls.isolated_network_offering.update(cls.apiclient, state='Enabled')
            cls.testdata["nw_off_isolated_persistent"]["specifyVlan"] = True
            cls.isolated_network_offering_vlan = NetworkOffering.create(
                cls.apiclient,
                cls.testdata["nw_off_isolated_persistent"])
            cls._cleanup.append(cls.isolated_network_offering_vlan)
            cls.isolated_network_offering_vlan.update(cls.apiclient, state='Enabled')
            cls.service_offering = ServiceOffering.create(
                cls.apiclient,
                cls.testdata["service_offering"])
            cls._cleanup.append(cls.service_offering)
            cls.testdata["small"]["zoneid"] = cls.zone.id
            cls.testdata["small"]["template"] = template.id
        except Exception as e:
            cls.tearDownClass()
            raise unittest.SkipTest(e)
        return
    @classmethod
    def tearDownClass(cls):
        try:
            # Cleanup resources used
            cleanup_resources(cls.apiclient, cls._cleanup)
        except Exception as e:
            raise Exception("Warning: Exception during cleanup : %s" % e)
        return

    def setUp(self):
        self.apiclient = self.testClient.getApiClient()
        self.dbclient = self.testClient.getDbConnection()
        self.cleanup = []
        self.physical_network, self.free_vlan = setNonContiguousVlanIds(
            self.apiclient,
            self.zone.id)
        return

    def tearDown(self):
        try:
            # Clean up
            cleanup_resources(self.apiclient, self.cleanup)
        except Exception as e:
            raise Exception("Warning: Exception during cleanup : %s" % e)
        finally:
            # Restore the original vlan range of the physical network
            self.physical_network.update(self.apiclient,
                                         id=self.physical_network.id,
                                         vlan=self.physical_network.vlan)
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_01_dedicate_wrong_vlan_range(self):
        """Dedicate an invalid vlan range to an account

        # Validate the following:
        # 1. Create an account in the root domain
        # 2. Try to update the physical network with an invalid range (5000-5001)
        #    and dedicate it to the account
        # 3. The operation should fail
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        vlan_range = "5000-5001"
        new_vlan = self.physical_network.vlan + "," + vlan_range
        with self.assertRaises(Exception):
            self.physical_network.update(self.apiclient,
                                         id=self.physical_network.id,
                                         vlan=new_vlan)
            # Dedicating guest vlan range
            PhysicalNetwork.dedicate(
                self.apiclient,
                vlan_range,
                physicalnetworkid=self.physical_network.id,
                account=self.account.name,
                domainid=self.account.domainid
            )
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_02_dedicate_vlan_range_invalid_account(self):
        """Dedicate a guest vlan range to an invalid account

        # Validate the following:
        # 1. Create an account in the root domain
        # 2. Update the physical network with a new guest vlan range
        # 3. Try to dedicate it to an invalid account
        # 4. The operation should fail
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=new_vlan)
        with self.assertRaises(Exception):
            # Dedicating to a non-existent account should fail
            PhysicalNetwork.dedicate(
                self.apiclient,
                self.free_vlan["partial_range"][0],
                physicalnetworkid=self.physical_network.id,
                account=self.account.name + random_gen(),
                domainid=self.account.domainid
            )
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_03_dedicate_already_dedicated_range(self):
        """Dedicate a guest vlan range which is already dedicated

        # Validate the following:
        # 1. Create two accounts in the root domain
        # 2. Update the physical network with a new guest vlan range
        # 3. Dedicate the vlan range to account 1
        # 4. Try to dedicate the same range to account 2; the operation should fail
        """
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account1)
        self.account2 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account2)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=new_vlan)
        # Dedicating guest vlan range
        PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        with self.assertRaises(Exception):
            # Dedicating the same guest vlan range again should fail
            PhysicalNetwork.dedicate(
                self.apiclient,
                self.free_vlan["partial_range"][0],
                physicalnetworkid=self.physical_network.id,
                account=self.account2.name,
                domainid=self.account2.domainid
            )
        return
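# ---------------------------------------------------------------------------
# Illustrative sketch (not part of the original suite): splitting the
# comma-separated vlan attribute of a physical network into its individual
# ranges, mirroring the xsplit-based check used below in TestDeleteVlanRange.
# The helper name is hypothetical.
# ---------------------------------------------------------------------------
def list_vlan_ranges(vlan_string):
    """Return the individual "start-end" ranges in a physical network vlan string."""
    return [r.strip() for r in str(vlan_string).split(",") if r.strip()]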
class TestDeleteVlanRange(cloudstackTestCase):

    @classmethod
    def setUpClass(cls):
        cls.testClient = super(TestDeleteVlanRange, cls).getClsTestClient()
        cls.apiclient = cls.testClient.getApiClient()
        cls.testdata = cls.testClient.getParsedTestDataConfig()
        # Get Zone, Domain
        cls.domain = get_domain(cls.apiclient)
        cls.zone = get_zone(cls.apiclient)
        cls.testdata["isolated_network"]["zoneid"] = cls.zone.id
        cls.testdata['mode'] = cls.zone.networktype
        template = get_template(
            cls.apiclient,
            cls.zone.id,
            cls.testdata["ostype"]
        )
        cls._cleanup = []
        try:
            cls.isolated_persistent_network_offering = NetworkOffering.create(
                cls.apiclient,
                cls.testdata["nw_off_isolated_persistent"])
            cls._cleanup.append(cls.isolated_persistent_network_offering)
            cls.isolated_persistent_network_offering.update(cls.apiclient, state='Enabled')
            cls.isolated_network_offering = NetworkOffering.create(
                cls.apiclient,
                cls.testdata["isolated_network_offering"])
            cls._cleanup.append(cls.isolated_network_offering)
            cls.isolated_network_offering.update(cls.apiclient, state='Enabled')
            # Use the same key casing ("specifyVlan") as the other test classes
            cls.testdata["nw_off_isolated_persistent"]["specifyVlan"] = True
            cls.isolated_network_offering_vlan = NetworkOffering.create(
                cls.apiclient,
                cls.testdata["nw_off_isolated_persistent"])
            cls._cleanup.append(cls.isolated_network_offering_vlan)
            cls.isolated_network_offering_vlan.update(cls.apiclient, state='Enabled')
            cls.service_offering = ServiceOffering.create(
                cls.apiclient,
                cls.testdata["service_offering"])
            cls._cleanup.append(cls.service_offering)
            cls.testdata["small"]["zoneid"] = cls.zone.id
            cls.testdata["small"]["template"] = template.id
        except Exception as e:
            cls.tearDownClass()
            raise unittest.SkipTest(e)
        return

    @classmethod
    def tearDownClass(cls):
        try:
            # Cleanup resources used
            cleanup_resources(cls.apiclient, cls._cleanup)
        except Exception as e:
            raise Exception("Warning: Exception during cleanup : %s" % e)
        return
    def setUp(self):
        self.apiclient = self.testClient.getApiClient()
        self.dbclient = self.testClient.getDbConnection()
        self.cleanup = []
        self.physical_network, self.free_vlan = setNonContiguousVlanIds(
            self.apiclient,
            self.zone.id)
        return

    def tearDown(self):
        try:
            # Clean up
            cleanup_resources(self.apiclient, self.cleanup)
        except Exception as e:
            raise Exception("Warning: Exception during cleanup : %s" % e)
        finally:
            # Restore the original vlan range of the physical network
            self.physical_network.update(self.apiclient,
                                         id=self.physical_network.id,
                                         vlan=self.physical_network.vlan)
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_01_delete_dedicated_vlan_range(self):
        """Try to delete a dedicated vlan range which is not in use

        # Validate the following:
        # 1. Create an account in the root domain
        # 2. Update the physical network with a new vlan range
        # 3. Dedicate this vlan range to the account
        # 4. Verify that the vlan range is dedicated to the account by listing it
        #    and verifying the account name
        # 5. Try to delete the vlan range by updating the physical network vlan;
        #    the operation should fail
        # 6. Release the dedicated range and then delete the vlan range
        # 7. The operation should succeed
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        new_vlan = self.free_vlan["partial_range"][0]
        extended_vlan = self.physical_network.vlan + "," + new_vlan
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=extended_vlan)
        # Dedicating guest vlan range
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to the account"
        )
        with self.assertRaises(Exception):
            # Deleting the vlan range while it is still dedicated should fail
            self.physical_network.update(self.apiclient,
                                         id=self.physical_network.id,
                                         vlan=self.physical_network.vlan)
        dedicate_guest_vlan_range_response.release(self.apiclient)
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=self.physical_network.vlan)
        physical_networks = PhysicalNetwork.list(self.apiclient, id=self.physical_network.id, listall=True)
        self.assertEqual(validateList(physical_networks)[0], PASS, "Physical networks list validation failed")
        vlans = xsplit(physical_networks[0].vlan, [','])
        self.assertFalse(new_vlan in vlans,
                         "The newly added vlan range was not deleted from the physical network")
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_02_delete_dedicated_vlan_range_vlan_in_use(self):
        """Try to delete a dedicated vlan range which is in use

        # Validate the following:
        # 1. Create an account in the root domain
        # 2. Update the physical network with a new vlan range
        # 3. Dedicate this vlan range to the account
        # 4. Verify that the vlan range is dedicated to the account by listing it
        #    and verifying the account name
        # 5. Create a guest network in the account which consumes a vlan from
        #    the dedicated range
        # 6. Try to delete the vlan range by updating the physical network vlan
        # 7. The operation should fail
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=new_vlan)
        # Dedicating guest vlan range
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to the account"
        )
        Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account.name,
            self.account.domainid,
            networkofferingid=self.isolated_persistent_network_offering.id)
        with self.assertRaises(Exception):
            # Deleting the vlan range while it is in use should fail
            self.physical_network.update(self.apiclient,
                                         id=self.physical_network.id,
                                         vlan=self.physical_network.vlan)
        return
    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_03_delete_account(self):
        """Delete an account which has a dedicated vlan range in use

        # Validate the following:
        # 1. Create an account in the root domain
        # 2. Update the physical network with a new vlan range
        # 3. Dedicate this vlan range to the account
        # 4. Verify that the vlan range is dedicated to the account by listing it
        #    and verifying the account name
        # 5. Create a guest network in the account which consumes a vlan from the dedicated range
        # 6. Delete the account
        # 7. Verify that the vlan of the physical network remains the same
        """
        self.account = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
                                     id=self.physical_network.id,
                                     vlan=new_vlan)
        # Dedicating guest vlan range
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account.name,
            domainid=self.account.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account.name,
            "listDedicatedGuestVlanRanges should report the range as dedicated to the account"
        )
        Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account.name,
            self.account.domainid,
            networkofferingid=self.isolated_persistent_network_offering.id)
        self.account.delete(self.apiclient)
        self.cleanup.remove(self.account)
        physical_networks = PhysicalNetwork.list(self.apiclient, id=self.physical_network.id, listall=True)
        self.assertEqual(validateList(physical_networks)[0], PASS, "Physical networks list validation failed")
        self.assertEqual(physical_networks[0].vlan, new_vlan,
                         "The vlan of the physical network should remain the same after deleting the account")
        return

    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_04_release_range_no_vlan_in_use(self):
        """Release a dedicated vlan range when no vlan id is in use

        # Validate the following:
        # 1. Create account in root domain
        # 2. Dedicate a new vlan range to account
        # 3. Verify that the new vlan range is dedicated to account
             by listing the dedicated range and checking the account name
        # 4. Release the range
        # 5. Verify the range is released back to system by listing dedicated ranges (list should be empty)
        """
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account1)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
            id=self.physical_network.id, vlan=new_vlan)
        # Dedicating guest vlan range
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "Check account name is in listDedicatedGuestVlanRanges as the account the range is dedicated to"
        )
        self.debug("Releasing guest vlan range")
        dedicate_guest_vlan_range_response.release(self.apiclient)
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(self.apiclient)
        self.assertEqual(
            list_dedicated_guest_vlan_range_response,
            None,
            "Check vlan range is not available in listDedicatedGuestVlanRanges"
        )
        return

    @attr(tags=["advanced", "selfservice"], required_hardware="false")
    def test_05_release_range_vlan_in_use(self):
        """Release a dedicated vlan range when a vlan id is in use

        # Validate the following:
        # 1. Create account in root domain
        # 2. Dedicate a new vlan range to account
        # 3. Verify that the new vlan range is dedicated to account
             by listing the dedicated range and checking the account name
        # 4. Release the range
        # 5. The operation should succeed, as all vlans which are not in use should be released
        """
        self.account1 = Account.create(
            self.apiclient,
            self.testdata["account"],
            domainid=self.domain.id
        )
        self.cleanup.append(self.account1)
        new_vlan = self.physical_network.vlan + "," + self.free_vlan["partial_range"][0]
        self.physical_network.update(self.apiclient,
            id=self.physical_network.id, vlan=new_vlan)
        # Dedicating guest vlan range
        dedicate_guest_vlan_range_response = PhysicalNetwork.dedicate(
            self.apiclient,
            self.free_vlan["partial_range"][0],
            physicalnetworkid=self.physical_network.id,
            account=self.account1.name,
            domainid=self.account1.domainid
        )
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(
            self.apiclient,
            id=dedicate_guest_vlan_range_response.id
        )
        dedicated_guest_vlan_response = list_dedicated_guest_vlan_range_response[0]
        self.assertEqual(
            dedicated_guest_vlan_response.account,
            self.account1.name,
            "Check account name is in listDedicatedGuestVlanRanges as the account the range is dedicated to"
        )
        dedicatedvlans = str(self.free_vlan["partial_range"][0]).split("-")
        isolated_network = Network.create(
            self.apiclient,
            self.testdata["isolated_network"],
            self.account1.name,
            self.account1.domainid,
            networkofferingid=self.isolated_persistent_network_offering.id)
        networks = Network.list(self.apiclient, id=isolated_network.id)
        self.assertEqual(validateList(networks)[0], PASS, "networks list validation failed")
        self.assertTrue(int(dedicatedvlans[0]) <= int(networks[0].vlan) <= int(dedicatedvlans[1]),
                        "Vlan of the network should be from the dedicated range")
        self.debug("Releasing guest vlan range")
        dedicate_guest_vlan_range_response.release(self.apiclient)
        list_dedicated_guest_vlan_range_response = PhysicalNetwork.listDedicated(self.apiclient)
        self.assertEqual(
            list_dedicated_guest_vlan_range_response,
            None,
            "Check vlan range is not available in listDedicatedGuestVlanRanges"
        )
        return
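The range-membership check in test_05 above parses the dedicated "start-end" string inline with `split("-")` and a chained comparison. As a minimal sketch, the same logic could be factored into a standalone helper (the function name is an illustration, not an existing helper in the test suite):

```python
def vlan_in_dedicated_range(vlan_id, dedicated_range):
    """Return True if vlan_id falls inside a "start-end" range string.

    Mirrors the inline check used in test_05 above; illustrative only.
    """
    start, end = (int(part) for part in str(dedicated_range).split("-"))
    return start <= int(vlan_id) <= end
```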


# =============================================================================
# manual_files/output/visulization example/energy_table.py
# (repo: rishavsen1/transit-simulator, license: MIT)
# =============================================================================
import matplotlib.pyplot as plt
import seaborn as sns
import pandas as pd
import numpy as np
import glob
import os
### table of mi/gal for base scenario
path = r'/Traj6-7/RR1'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyr1d = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1d.loc[0,c]=energy
    c=c+1
energyr1d.columns=b
#energyr1d = energyr1d.replace([np.inf], np.nan)
#energyr1d.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyr1h = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1h.loc[0,c]=energy
    c=c+1
energyr1h.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyr1e = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1e.loc[0,c]=energy
    c=c+1
energyr1e.columns=b
path = r'/Traj6-7/RR3'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR3d = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR3d.loc[0,c]=energy
    c=c+1
energyR3d.columns=b
#energyR3d = energyR3d.replace([np.inf], np.nan)
#energyR3d.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR3h = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR3h.loc[0,c]=energy
    c=c+1
energyR3h.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR3e = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR3e.loc[0,c]=energy
    c=c+1
energyR3e.columns=b
path = r'/Traj6-7/RR4'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR4d = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR4d.loc[0,c]=energy
    c=c+1
energyR4d.columns=b
#energyR4d = energyR4d.replace([np.inf], np.nan)
#energyR4d.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR4h = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR4h.loc[0,c]=energy
    c=c+1
energyR4h.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR4e = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR4e.loc[0,c]=energy
    c=c+1
energyR4e.columns=b
path = r'/Traj6-7/RR9'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR9d = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR9d.loc[0,c]=energy
    c=c+1
energyR9d.columns=b
#energyR9d = energyR9d.replace([np.inf], np.nan)
#energyR9d.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR9h = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR9h.loc[0,c]=energy
    c=c+1
energyR9h.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR9e = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR9e.loc[0,c]=energy
    c=c+1
energyR9e.columns=b
path = r'/Traj6-7/RR10A'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Ad = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Ad.loc[0,c]=energy
    c=c+1
energyR10Ad.columns=b
#energyR10Ad = energyR10Ad.replace([np.inf], np.nan)
#energyR10Ad.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Ah = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Ah.loc[0,c]=energy
    c=c+1
energyR10Ah.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Ae = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Ae.loc[0,c]=energy
    c=c+1
energyR10Ae.columns=b
path = r'/Traj6-7/RR10G'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Gd = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Gd.loc[0,c]=energy
    c=c+1
energyR10Gd.columns=b
#energyR10Gd = energyR10Gd.replace([np.inf], np.nan)
#energyR10Gd.mean()
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Gh = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']*0.264172/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Gh.loc[0,c]=energy
    c=c+1
energyR10Gh.columns=b
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-33:-4] for mine in all_files]
energyR10Ge = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyR10Ge.loc[0,c]=energy
    c=c+1
energyR10Ge.columns=b
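The six route sections above repeat the same per-file aggregation with only the column name, unit factors, and target DataFrame changing. As a hedged sketch of how the diesel-case arithmetic could be factored out (the helper name and parameter defaults are assumptions carried over from the diesel blocks, not part of this script):

```python
import pandas as pd


def miles_per_gallon(df, rate_col="FuelRate",
                     speed_factor=0.621371 / 3600,  # km/h -> mi/s
                     rate_factor=0.264172 / 3600):  # L/h  -> gal/s
    """Aggregate mi/gal for one trajectory DataFrame.

    Mirrors the diesel blocks above; swap rate_col and the factors for
    the hybrid and electric cases.
    """
    speed = df["speed"].astype(float) * speed_factor       # mi/s
    rate = df[rate_col].astype(float) * rate_factor        # gal/s
    return speed.sum() / rate.sum()                        # mi/gal
```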
# =============================================================================
# =============================================================================
### table of mi/gal for reduced scenario
path = r'/reduced80traj6-7/Diesel'
all_files = glob.glob(os.path.join(path, "diesel*.csv"))
b=[mine[-27:-4] for mine in all_files]
energyr1d = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRate']/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1d.loc[0,c]=energy
    c=c+1
energyr1d.columns=b
path = r'/reduced80traj6-7/Hybrid'
all_files = glob.glob(os.path.join(path, "hybrid*.csv"))
b=[mine[-27:-4] for mine in all_files]
energyr1h = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=0.621371*df['speed']/3600 #mi/s
    df['energyrate']=(df['FuelRateH']/3600) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1h.loc[0,c]=energy
    c=c+1
energyr1h.columns=b
path = r'/reduced80traj6-7/Electric'
all_files = glob.glob(os.path.join(path, "elect*.csv"))
b=[mine[-27:-4] for mine in all_files]
energyr1e = pd.DataFrame()
energy=np.float32(0)
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2,10])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(float)
    df['speed']=df['speed']*(0.01*3.6)*0.621371/3600 #mi/s
    df['energyrate']=(df['Power']*0.02457002457002457*0.28/1000) #gal/s
    energy=df['speed'].sum()/df['energyrate'].sum() #mi/gal
    energyr1e.loc[0,c]=energy
    c=c+1
energyr1e.columns=b
energyall=pd.concat([energyr1d,energyr1h,energyr1e])
#figure5 left
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/Traj08'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
df_from_each_file = (pd.read_csv(f, usecols=[2,3]) for f in all_files)
traj08 = pd.concat(df_from_each_file, ignore_index=True)
traj08['speed']=traj08['speed']*(0.01*3.6*0.62137) #mph
traj08['acceleration']=traj08['acceleration']*(0.001)
traj08 = traj08[(traj08.speed > 5)]
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/Traj12'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
df_from_each_file = (pd.read_csv(f, usecols=[2,3]) for f in all_files)
traj12 = pd.concat(df_from_each_file, ignore_index=True)
traj12['speed']=traj12['speed']*(0.01*3.6*0.62137) #mph
traj12['acceleration']=traj12['acceleration']*(0.001)
traj12 = traj12[(traj12.speed > 5)]
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/Traj17'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
df_from_each_file = (pd.read_csv(f, usecols=[2,3]) for f in all_files)
traj17 = pd.concat(df_from_each_file, ignore_index=True)
traj17['speed']=traj17['speed']*(0.01*3.6*0.62137) #mph
traj17['acceleration']=traj17['acceleration']*(0.001)
traj17 = traj17[(traj17.speed > 5)]
fig, ax1 = plt.subplots(figsize=(8, 6))
sns.set_style("ticks")
sns.distplot(traj08['speed'], hist = False, kde = True,kde_kws = {"color": "#2980B9",'lw': 2},label="08")
sns.distplot(traj12['speed'], hist = False, kde = True,kde_kws = {"color": "#FE8E2A",'lw': 2},label="12")
sns.distplot(traj17['speed'], hist = False, kde = True,kde_kws = {"color": "#28B463",'lw': 2},label="17")
plt.xlabel("Speed (mph)",fontsize = 14)
plt.ylabel("Density",fontsize = 14)
ax1.set(xlim=(0, 75))
plt.setp(ax1.get_xticklabels(), fontsize = 14)
plt.setp(ax1.get_yticklabels(), fontsize = 14)
ax1.legend().set_title('Hour')
#plt.title('Route 10G',fontsize = 14)
plt.show()
#figure5 right
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/figure5right/08'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
speedmean08 = pd.DataFrame()
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(int)
    df['speed']=df['speed']*(0.01*3.6*0.62137) #mph
    speedmean08.loc[0,c]=df.iloc[:,0].mean()
    c=c+1
speedmean08.columns=['10A','10G','16','1','21','33','4','8','9']
speedmean08=speedmean08.T
speedmean08 = speedmean08.reset_index()
speedmean08.columns=['Route','Speed']
speedmean08['Hour'] = "08"
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/figure5right/12'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
speedmean12 = pd.DataFrame()
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(int)
    df['speed']=df['speed']*(0.01*3.6*0.62137) #mph
    speedmean12.loc[0,c]=df.iloc[:,0].mean()
    c=c+1
speedmean12.columns=['10A','10G','16','1','21','33','4','8','9']
speedmean12=speedmean12.T
speedmean12 = speedmean12.reset_index()
speedmean12.columns=['Route','Speed']
speedmean12['Hour']="12"
path = r'E:/SUMO/RUIXIAO/newChattanooganet/Data/output/figure5right/17'
all_files = glob.glob(os.path.join(path, "Route*.csv"))
speedmean17 = pd.DataFrame()
c=0
for f in all_files:
    df=pd.read_csv(f, usecols=[2])
    # df.drop(df.index[0],inplace=True)
    df = df.astype(int)
    df['speed']=df['speed']*(0.01*3.6*0.62137) #mph
    speedmean17.loc[0,c]=df.iloc[:,0].mean()
    c=c+1
speedmean17.columns=['10A','10G','16','1','21','33','4','8','9']
speedmean17=speedmean17.T
speedmean17 = speedmean17.reset_index()
speedmean17.columns=['Route','Speed']
speedmean17['Hour']="17"
speedmean=pd.concat([speedmean08,speedmean12,speedmean17],ignore_index=True)
fig, ax1 = plt.subplots(figsize=(8, 6))
sns.scatterplot(data=speedmean, x="Route", y="Speed", hue="Hour", style="Hour",legend=False)
sns.lineplot(data=speedmean, x="Route", y="Speed", hue="Hour", style="Hour",linewidth = 2)
plt.xlabel("Route",fontsize = 14)
plt.ylabel("Average Speed of Buses(mph)",fontsize = 14)
#ax1.set(xlim=(0, 75))
plt.setp(ax1.get_xticklabels(), fontsize = 14)
plt.setp(ax1.get_yticklabels(), fontsize = 14)
#ax1.legend().set_title('Hour')
#plt.title('Route 10G',fontsize = 14)
plt.show()


# =============================================================================
# Krogg/aliens.py
# (repo: wang0618/ascii-art, license: MIT)
# =============================================================================
# https://web.archive.org/web/20000621125710/http://gtcom.net/~krogg/ascii/ALIENS.HTM
# The aliens
name = "UFO"
frames = [
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o/ o o \r\n"+
" / /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o/ o \r\n"+
" /\\ / /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o \\o o \r\n"+
" /\\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
"> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \r\n"+
"#> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
"> \r\n"+
"##> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
"\<> \r\n"+
"###> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
"####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
"\<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o o o \r\n"+
" /\\ /\\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" WOW!! \r\n"+
" \\ \r\n"+
" \\ \r\n"+
" \\o \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" WOW!! \r\n"+
" \\ \r\n"+
" \\ \r\n"+
" \\o \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \\o \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \\o \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" : \r\n"+
" : \r\n"+
" : \r\n"+
" : \r\n"+
" \\o \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" : \r\n"+
" : \r\n"+
" : \r\n"+
" : \r\n"+
" \\O \\o o \r\n"+
" \\ \\ /\\ \r\n"+
" /\\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" : \r\n"+
" : \r\n"+
" %O \\o o \r\n"+
" % \\ /\\ \r\n"+
" %% ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" %O \\o o \r\n"+
" % \\ /\\ \r\n"+
" \<\< ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" / O \r\n"+
" /o /o \r\n"+
" \< / \\ \\ \r\n"+
" \\ ## /\\ /\\ \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \\ \r\n"+
" \r\n"+
" O /o /o \r\n"+
" _ \\ \\ \r\n"+
" \\ _ ## /\\ >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \\ /o /o \r\n"+
" o_ \\ \\ \r\n"+
" _ _ ## /\\ >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" * \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" /o /o \r\n"+
" o_ \\ \\ \r\n"+
"_ _ _ ## >> >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" * \r\n"+
" \r\n"+
" \r\n"+
" /o /o \r\n"+
" o_ \\ \\ \r\n"+
"_ _ _ ## >> >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" * \r\n"+
" \r\n"+
" * \r\n"+
" \r\n"+
" /o /o \r\n"+
" o_ \\ \\ \r\n"+
"_ _ _ ## >> >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" * \r\n"+
" \r\n"+
" \r\n"+
" * \r\n"+
" /o /o \r\n"+
" o_ \\ \\ \r\n"+
"_ _ _ ## >> >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" * \r\n"+
" \r\n"+
" \r\n"+
" %O /o \r\n"+
" o_ % \\ \r\n"+
"_ _ _ ## %% >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" * \r\n"+
" / O \r\n"+
" \\ /o \r\n"+
" o_/ \\ \r\n"+
"_ _ _ ## _ >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" / O \r\n"+
" \\* \r\n"+
" /o \r\n"+
" o\\ \\ \r\n"+
"_ _ _ ## _ >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" / O \r\n"+
" \\ \r\n"+
" \r\n"+
" /O \r\n"+
" o\\ \\ \r\n"+
"_ _ _ ## _ >> \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \\ \r\n"+
" o\r\n"+
" \\ \r\n"+
" %O \r\n"+
" o\\ % \r\n"+
"_ _ _ ## _ %% \r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" _ \r\n"+
" / O \r\n"+
" _ \r\n"+
" o\\ \\\r\n"+
"_ _ _ ## _ _ >\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" / O \r\n"+
" \r\n"+
" _ \r\n"+
" _ o\\ _\r\n"+
"_ _ _ ## _ _ _\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" O \r\n"+
" \r\n"+
" \r\n"+
" \\ \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" o \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ \\ \r\n"+
"_ _ _ _ ##_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" o \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ o \r\n"+
"_ _ _ _ ##\\_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _o=\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ o \r\n"+
"_ _ _ _ ##\\_ _ _ =\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" * \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _o=\r",
" \r\n"+
" \<> \r\n"+
" \<####> \r\n"+
" \\|/ \r\n"+
" /|\\ \r\n"+
" \r\n"+
" \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _o=\r",
" \r\n"+
" \<> \r\n"+
" \<\\###/ \r\n"+
" \\|/ \r\n"+
" --#-- \r\n"+
" /|\\ \r\n"+
" / \\ \r\n"+
" \r\n"+
" o\\ \r\n"+
"_ _ _ _ ##\\_ _ _o=\r",
" # # # \r\n"+
" # \<# # \r\n"+
" #\<\\###/ # \r\n"+
" # \\#/ # \r\n"+
" ##### \r\n"+
" ############### \r\n"+
" ##### \r\n"+
" # # # \r\n"+
" # #\\ # \r\n"+
"_ _ #_ _ ##\\_ # _o=\r",
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r\n"+
"####################\r",
"# # # # # # # # # # \r\n"+
" # # # # # # # # # #\r\n"+
"# # # # # # # # # # \r\n"+
" # # # # # # # # # #\r\n"+
"# # # # # # # # # \r\n"+
" # # # # # # # # # #\r\n"+
"# # # # # # # # # \r\n"+
" # # # # # # # # # \r\n"+
"# # # # # # # # # #\r\n"+
" ## ## ## # # # # #\r",
" # # # \r\n"+
" \r\n"+
" # # # # # # \r\n"+
" \r\n"+
"# # # # # \r\n"+
" \r\n"+
" # # # # \r\n"+
" \r\n"+
"# # # # \r\n"+
" \r"
]
duration = 230
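Each entry in `frames` above is one full screen of text carrying its own line breaks, and `duration` is the per-frame delay in milliseconds. A minimal terminal player sketch (the `play` helper and its ANSI clear-screen escape are assumptions; the original page animated these frames in a browser):

```python
import sys
import time


def play(frames, duration_ms, out=sys.stdout):
    """Repaint each frame in turn, waiting duration_ms between frames.

    Clears the screen with an ANSI escape before every frame; this is an
    assumption about the output terminal, not part of the original art.
    """
    for frame in frames:
        out.write("\x1b[H\x1b[2J")  # cursor home + clear screen (ANSI)
        out.write(frame + "\n")
        out.flush()
        time.sleep(duration_ms / 1000.0)
```

For example, `play(frames, duration)` would run the animation once at the delay defined above.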


# =============================================================================
# v6.0.6/log/test_fortios_log_setting.py
# (repo: fortinet-solutions-cse/ansible_fgt_modules, license: Apache-2.0)
# =============================================================================
# Copyright 2019 Fortinet, Inc.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <https://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import json
import pytest
from mock import ANY
from ansible.module_utils.network.fortios.fortios import FortiOSHandler
try:
    from ansible.modules.network.fortios import fortios_log_setting
except ImportError:
    pytest.skip("Could not load required modules for testing", allow_module_level=True)


@pytest.fixture(autouse=True)
def connection_mock(mocker):
    connection_class_mock = mocker.patch('ansible.modules.network.fortios.fortios_log_setting.Connection')
    return connection_class_mock


fos_instance = FortiOSHandler(connection_mock)


def test_log_setting_creation(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'log_setting': {
            'brief_traffic_format': 'enable',
            'daemon_log': 'enable',
            'expolicy_implicit_log': 'enable',
            'fwpolicy_implicit_log': 'enable',
            'fwpolicy6_implicit_log': 'enable',
            'local_in_allow': 'enable',
            'local_in_deny_broadcast': 'enable',
            'local_in_deny_unicast': 'enable',
            'local_out': 'enable',
            'log_invalid_packet': 'enable',
            'log_policy_comment': 'enable',
            'log_policy_name': 'enable',
            'log_user_in_upper': 'enable',
            'neighbor_event': 'enable',
            'resolve_ip': 'enable',
            'resolve_port': 'enable',
            'user_anonymize': 'enable'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_log_setting.fortios_log(input_data, fos_instance)

    expected_data = {
        'brief-traffic-format': 'enable',
        'daemon-log': 'enable',
        'expolicy-implicit-log': 'enable',
        'fwpolicy-implicit-log': 'enable',
        'fwpolicy6-implicit-log': 'enable',
        'local-in-allow': 'enable',
        'local-in-deny-broadcast': 'enable',
        'local-in-deny-unicast': 'enable',
        'local-out': 'enable',
        'log-invalid-packet': 'enable',
        'log-policy-comment': 'enable',
        'log-policy-name': 'enable',
        'log-user-in-upper': 'enable',
        'neighbor-event': 'enable',
        'resolve-ip': 'enable',
        'resolve-port': 'enable',
        'user-anonymize': 'enable'
    }

    set_method_mock.assert_called_with('log', 'setting', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert not is_error
    assert changed
    assert response['status'] == 'success'
    assert response['http_status'] == 200
def test_log_setting_creation_fails(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'error', 'http_method': 'POST', 'http_status': 500}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'log_setting': {
            'brief_traffic_format': 'enable',
            'daemon_log': 'enable',
            'expolicy_implicit_log': 'enable',
            'fwpolicy_implicit_log': 'enable',
            'fwpolicy6_implicit_log': 'enable',
            'local_in_allow': 'enable',
            'local_in_deny_broadcast': 'enable',
            'local_in_deny_unicast': 'enable',
            'local_out': 'enable',
            'log_invalid_packet': 'enable',
            'log_policy_comment': 'enable',
            'log_policy_name': 'enable',
            'log_user_in_upper': 'enable',
            'neighbor_event': 'enable',
            'resolve_ip': 'enable',
            'resolve_port': 'enable',
            'user_anonymize': 'enable'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_log_setting.fortios_log(input_data, fos_instance)

    expected_data = {
        'brief-traffic-format': 'enable',
        'daemon-log': 'enable',
        'expolicy-implicit-log': 'enable',
        'fwpolicy-implicit-log': 'enable',
        'fwpolicy6-implicit-log': 'enable',
        'local-in-allow': 'enable',
        'local-in-deny-broadcast': 'enable',
        'local-in-deny-unicast': 'enable',
        'local-out': 'enable',
        'log-invalid-packet': 'enable',
        'log-policy-comment': 'enable',
        'log-policy-name': 'enable',
        'log-user-in-upper': 'enable',
        'neighbor-event': 'enable',
        'resolve-ip': 'enable',
        'resolve-port': 'enable',
        'user-anonymize': 'enable'
    }

    set_method_mock.assert_called_with('log', 'setting', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert is_error
    assert not changed
    assert response['status'] == 'error'
    assert response['http_status'] == 500
def test_log_setting_idempotent(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'error', 'http_method': 'DELETE', 'http_status': 404}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'log_setting': {
            'brief_traffic_format': 'enable',
            'daemon_log': 'enable',
            'expolicy_implicit_log': 'enable',
            'fwpolicy_implicit_log': 'enable',
            'fwpolicy6_implicit_log': 'enable',
            'local_in_allow': 'enable',
            'local_in_deny_broadcast': 'enable',
            'local_in_deny_unicast': 'enable',
            'local_out': 'enable',
            'log_invalid_packet': 'enable',
            'log_policy_comment': 'enable',
            'log_policy_name': 'enable',
            'log_user_in_upper': 'enable',
            'neighbor_event': 'enable',
            'resolve_ip': 'enable',
            'resolve_port': 'enable',
            'user_anonymize': 'enable'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_log_setting.fortios_log(input_data, fos_instance)

    expected_data = {
        'brief-traffic-format': 'enable',
        'daemon-log': 'enable',
        'expolicy-implicit-log': 'enable',
        'fwpolicy-implicit-log': 'enable',
        'fwpolicy6-implicit-log': 'enable',
        'local-in-allow': 'enable',
        'local-in-deny-broadcast': 'enable',
        'local-in-deny-unicast': 'enable',
        'local-out': 'enable',
        'log-invalid-packet': 'enable',
        'log-policy-comment': 'enable',
        'log-policy-name': 'enable',
        'log-user-in-upper': 'enable',
        'neighbor-event': 'enable',
        'resolve-ip': 'enable',
        'resolve-port': 'enable',
        'user-anonymize': 'enable'
    }

    set_method_mock.assert_called_with('log', 'setting', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert not is_error
    assert not changed
    assert response['status'] == 'error'
    assert response['http_status'] == 404
def test_log_setting_filter_foreign_attributes(mocker):
    schema_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.schema')

    set_method_result = {'status': 'success', 'http_method': 'POST', 'http_status': 200}
    set_method_mock = mocker.patch('ansible.module_utils.network.fortios.fortios.FortiOSHandler.set', return_value=set_method_result)

    input_data = {
        'username': 'admin',
        'state': 'present',
        'log_setting': {
            'random_attribute_not_valid': 'tag',
            'brief_traffic_format': 'enable',
            'daemon_log': 'enable',
            'expolicy_implicit_log': 'enable',
            'fwpolicy_implicit_log': 'enable',
            'fwpolicy6_implicit_log': 'enable',
            'local_in_allow': 'enable',
            'local_in_deny_broadcast': 'enable',
            'local_in_deny_unicast': 'enable',
            'local_out': 'enable',
            'log_invalid_packet': 'enable',
            'log_policy_comment': 'enable',
            'log_policy_name': 'enable',
            'log_user_in_upper': 'enable',
            'neighbor_event': 'enable',
            'resolve_ip': 'enable',
            'resolve_port': 'enable',
            'user_anonymize': 'enable'
        },
        'vdom': 'root'}

    is_error, changed, response = fortios_log_setting.fortios_log(input_data, fos_instance)

    expected_data = {
        'brief-traffic-format': 'enable',
        'daemon-log': 'enable',
        'expolicy-implicit-log': 'enable',
        'fwpolicy-implicit-log': 'enable',
        'fwpolicy6-implicit-log': 'enable',
        'local-in-allow': 'enable',
        'local-in-deny-broadcast': 'enable',
        'local-in-deny-unicast': 'enable',
        'local-out': 'enable',
        'log-invalid-packet': 'enable',
        'log-policy-comment': 'enable',
        'log-policy-name': 'enable',
        'log-user-in-upper': 'enable',
        'neighbor-event': 'enable',
        'resolve-ip': 'enable',
        'resolve-port': 'enable',
        'user-anonymize': 'enable'
    }

    set_method_mock.assert_called_with('log', 'setting', data=expected_data, vdom='root')
    schema_method_mock.assert_not_called()
    assert not is_error
    assert changed
    assert response['status'] == 'success'
    assert response['http_status'] == 200
| 37.435714 | 133 | 0.623736 | 1,153 | 10,482 | 5.411969 | 0.156982 | 0.046154 | 0.065385 | 0.04359 | 0.832051 | 0.817628 | 0.802404 | 0.802404 | 0.802404 | 0.802404 | 0 | 0.004741 | 0.23526 | 10,482 | 279 | 134 | 37.569892 | 0.773703 | 0.063347 | 0 | 0.851528 | 0 | 0 | 0.426036 | 0.149765 | 0 | 0 | 0 | 0 | 0.104803 | 1 | 0.021834 | false | 0 | 0.034935 | 0 | 0.061135 | 0.004367 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e36b393fc84b787895af352d9abf4b4fb8f560f6 | 456 | py | Python | BugInThere/python/BugInThere.py | Lian0123/TextArt | 26a2378a48fb1e49df067240fcb1dbd50e346a8e | [
"MIT"
] | 2 | 2018-07-29T05:02:53.000Z | 2021-08-16T10:52:59.000Z | BugInThere/python/BugInThere.py | Lian0123/TextArt | 26a2378a48fb1e49df067240fcb1dbd50e346a8e | [
"MIT"
] | null | null | null | BugInThere/python/BugInThere.py | Lian0123/TextArt | 26a2378a48fb1e49df067240fcb1dbd50e346a8e | [
"MIT"
] | null | null | null | #
# ┌────\ ╭──────╮ │
# │ \ │ │ │ │ │
# │ / │ │ │ │ │ Σ(⊙ д ⊙ )
# ├────< │ │ │ │ in there!!!
# │ \ │ │ │ ___ │
# │ / │ │ │ │ \ │ /
# └────/ ╰──────╯ ╰──────╯ \│/ By:
# V | 50.666667 | 62 | 0.032895 | 43 | 456 | 1.906977 | 0.302326 | 0.609756 | 0.804878 | 0.926829 | 0.292683 | 0.292683 | 0.292683 | 0.292683 | 0.292683 | 0.292683 | 0 | 0 | 0.756579 | 456 | 9 | 63 | 50.666667 | 0.108108 | 0.837719 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 1 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
8b81f6b9724ecabb34d4696dea1eabac2cbb4755 | 6,281 | py | Python | tests/unit/db/redshift/test_unloader.py | cwegrzyn/records-mover | e3b71d6c09d99d0bcd6a956b9d09d20f8abe98d2 | [
"Apache-2.0"
] | 36 | 2020-03-17T11:56:51.000Z | 2022-01-19T16:03:32.000Z | tests/unit/db/redshift/test_unloader.py | cwegrzyn/records-mover | e3b71d6c09d99d0bcd6a956b9d09d20f8abe98d2 | [
"Apache-2.0"
] | 60 | 2020-03-02T23:13:29.000Z | 2021-05-19T15:05:42.000Z | tests/unit/db/redshift/test_unloader.py | cwegrzyn/records-mover | e3b71d6c09d99d0bcd6a956b9d09d20f8abe98d2 | [
"Apache-2.0"
] | 4 | 2020-08-11T13:17:37.000Z | 2021-11-05T21:11:52.000Z | import unittest
from records_mover.db.redshift.unloader import RedshiftUnloader
from records_mover.db.errors import NoTemporaryBucketConfiguration
from records_mover.records.records_format import DelimitedRecordsFormat, ParquetRecordsFormat
from mock import patch, Mock, MagicMock
class TestRedshiftUnloader(unittest.TestCase):
    @patch('records_mover.db.redshift.unloader.redshift_unload_options')
    @patch('records_mover.db.redshift.unloader.RecordsUnloadPlan')
    def test_can_unload_format_true(self,
                                    mock_RecordsUnloadPlan,
                                    mock_redshift_unload_options):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_target_records_format = Mock(name='target_records_format', spec=DelimitedRecordsFormat)
        mock_unload_plan = mock_RecordsUnloadPlan.return_value
        mock_unload_plan.records_format = mock_target_records_format
        mock_processing_instructions = mock_unload_plan.processing_instructions
        mock_s3_temp_base_loc = Mock(name='s3_temp_base_loc')
        mock_target_records_format.hints = {}
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        out = redshift_unloader.can_unload_format(mock_target_records_format)
        mock_RecordsUnloadPlan.\
            assert_called_with(records_format=mock_target_records_format)
        mock_redshift_unload_options.\
            assert_called_with(set(),
                               mock_unload_plan.records_format,
                               mock_processing_instructions.fail_if_cant_handle_hint)
        self.assertEqual(True, out)
    @patch('records_mover.db.redshift.unloader.redshift_unload_options')
    @patch('records_mover.db.redshift.unloader.RecordsUnloadPlan')
    def test_can_unload_format_delimited_false(self,
                                               mock_RecordsUnloadPlan,
                                               mock_redshift_unload_options):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_target_records_format = Mock(name='target_records_format', spec=DelimitedRecordsFormat)
        mock_unload_plan = mock_RecordsUnloadPlan.return_value
        mock_unload_plan.records_format = mock_target_records_format
        mock_processing_instructions = mock_unload_plan.processing_instructions
        mock_s3_temp_base_loc = Mock(name='s3_temp_base_loc')
        mock_target_records_format.hints = {}
        mock_redshift_unload_options.side_effect = NotImplementedError
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        out = redshift_unloader.can_unload_format(mock_target_records_format)
        mock_RecordsUnloadPlan.\
            assert_called_with(records_format=mock_target_records_format)
        mock_redshift_unload_options.\
            assert_called_with(set(),
                               mock_unload_plan.records_format,
                               mock_processing_instructions.fail_if_cant_handle_hint)
        self.assertEqual(False, out)
    def test_can_unload_to_scheme_s3_true(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=None)
        self.assertTrue(redshift_unloader.can_unload_to_scheme('s3'))

    def test_can_unload_to_scheme_file_without_temp_bucket_false(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=None)
        self.assertFalse(redshift_unloader.can_unload_to_scheme('file'))

    def test_can_unload_to_scheme_file_with_temp_bucket_true(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_s3_temp_base_loc = Mock(name='s3_temp_base_loc')
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        self.assertTrue(redshift_unloader.can_unload_to_scheme('file'))

    def test_known_supported_records_formats_for_unload(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_s3_temp_base_loc = Mock(name='s3_temp_base_loc')
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        formats = redshift_unloader.known_supported_records_formats_for_unload()
        self.assertEqual([f.__class__ for f in formats],
                         [DelimitedRecordsFormat, ParquetRecordsFormat])

    def test_temporary_unloadable_directory_loc(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_s3_temp_base_loc = MagicMock(name='s3_temp_base_loc')
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        with redshift_unloader.temporary_unloadable_directory_loc() as loc:
            self.assertEqual(loc,
                             mock_s3_temp_base_loc.temporary_directory.return_value.__enter__.
                             return_value)

    def test_temporary_unloadable_directory_loc_unset(self):
        mock_db = Mock(name='db')
        mock_table = Mock(name='table')
        mock_s3_temp_base_loc = None
        redshift_unloader =\
            RedshiftUnloader(db=mock_db,
                             table=mock_table,
                             s3_temp_base_loc=mock_s3_temp_base_loc)
        with self.assertRaises(NoTemporaryBucketConfiguration):
            with redshift_unloader.temporary_unloadable_directory_loc():
                pass
| 45.18705 | 100 | 0.65117 | 690 | 6,281 | 5.434783 | 0.115942 | 0.0416 | 0.069333 | 0.090133 | 0.842933 | 0.8328 | 0.8008 | 0.7408 | 0.7256 | 0.702133 | 0 | 0.00621 | 0.282121 | 6,281 | 138 | 101 | 45.514493 | 0.82546 | 0 | 0 | 0.724138 | 0 | 0 | 0.064958 | 0.041713 | 0 | 0 | 0 | 0 | 0.103448 | 1 | 0.068966 | false | 0.008621 | 0.043103 | 0 | 0.12069 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8bdb2c66bbc2dc09ab8cf23741dce43c47a9b6be | 2,259 | py | Python | app/main/dto/user_dto.py | victortrinh/cabronis-be | c429767c7a43ba37cdb4850404a4651ea1f3eecf | [
"MIT"
] | null | null | null | app/main/dto/user_dto.py | victortrinh/cabronis-be | c429767c7a43ba37cdb4850404a4651ea1f3eecf | [
"MIT"
] | null | null | null | app/main/dto/user_dto.py | victortrinh/cabronis-be | c429767c7a43ba37cdb4850404a4651ea1f3eecf | [
"MIT"
] | 1 | 2021-02-17T06:46:34.000Z | 2021-02-17T06:46:34.000Z | from flask_restplus import Namespace, fields
from ..model.user import UserRole


class UserDto:
    api = Namespace('User', description='user related operations')

    user = api.model('user', {
        'user_id': fields.String(required=False, description='user identifier'),
        'email': fields.String(required=True, description='user email address'),
        'first_name': fields.String(required=True, description='user first name'),
        'last_name': fields.String(required=True, description='user last name'),
        'password': fields.String(required=True, description='user password')
    })

    # Each schema gets a distinct model name: registering several different
    # schemas under the same 'user' name would overwrite earlier definitions.
    user_roles = api.model('user_roles', {
        'roles': fields.List(fields.String(required=False, description='user roles', enum=UserRole._member_names_)),
    })

    user_update = api.model('user_update', {
        'email': fields.String(required=True, description='user email address'),
        'first_name': fields.String(required=True, description='user first name'),
        'last_name': fields.String(required=True, description='user last name'),
        'roles': fields.List(fields.String(required=False, description='user roles', enum=UserRole._member_names_)),
    })

    user_with_roles = api.model('user_with_roles', {
        'user_id': fields.String(required=False, description='user identifier'),
        'email': fields.String(required=True, description='user email address'),
        'first_name': fields.String(required=True, description='user first name'),
        'last_name': fields.String(required=True, description='user last name'),
        'password': fields.String(required=True, description='user password'),
        'roles': fields.List(fields.String(required=False, description='user roles', enum=UserRole._member_names_)),
    })

    user_change_password = api.model('user_change_password', {
        'user_id': fields.String(required=False, description='user identifier'),
        'email': fields.String(required=True, description='user email address'),
        'current_password': fields.String(required=True, description='Current user password'),
        'password': fields.String(required=True, description='New user password'),
        'confirm_password': fields.String(required=True, description='New user confirm password'),
    })
8bdf1663eff27e4caa2bf9ff6653d22e0d938867 | 30,735 | py | Python | test/test_main.py | jwodder/qypi | d8f56d5ea902e84189d9bb80c8d23f92cc83fd61 | [
"MIT"
] | 9 | 2017-04-02T20:14:09.000Z | 2022-01-09T15:48:58.000Z | test/test_main.py | jwodder/qypi | d8f56d5ea902e84189d9bb80c8d23f92cc83fd61 | [
"MIT"
] | 4 | 2018-04-19T23:35:53.000Z | 2021-12-09T03:44:03.000Z | test/test_main.py | jwodder/qypi | d8f56d5ea902e84189d9bb80c8d23f92cc83fd61 | [
"MIT"
] | 1 | 2017-06-11T16:41:19.000Z | 2017-06-11T16:41:19.000Z | import json
from traceback import format_exception
from click.testing import CliRunner
import pytest
from qypi.__main__ import qypi
def show_result(r):
if r.exception is not None:
return "".join(format_exception(*r.exc_info))
else:
return r.output
def test_list(mocker):
    spinstance = mocker.Mock(
        **{
            "list_packages.return_value": [
                "foobar",
                "BarFoo",
                "quux",
                "Gnusto-Cleesh",
                "XYZZY_PLUGH",
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["list"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "foobar\n" "BarFoo\n" "quux\n" "Gnusto-Cleesh\n" "XYZZY_PLUGH\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [mocker.call.list_packages()]
def test_owner(mocker):
    spinstance = mocker.Mock(
        **{
            "package_roles.return_value": [
                ["Owner", "luser"],
                ["Maintainer", "jsmith"],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["owner", "foobar"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "{\n"
        '    "foobar": [\n'
        "        {\n"
        '            "role": "Owner",\n'
        '            "user": "luser"\n'
        "        },\n"
        "        {\n"
        '            "role": "Maintainer",\n'
        '            "user": "jsmith"\n'
        "        }\n"
        "    ]\n"
        "}\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [mocker.call.package_roles("foobar")]
def test_multiple_owner(mocker):
    spinstance = mocker.Mock(
        **{
            "package_roles.side_effect": [
                [
                    ["Owner", "luser"],
                    ["Maintainer", "jsmith"],
                ],
                [
                    ["Owner", "jsmith"],
                    ["Owner", "froody"],
                ],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["owner", "foobar", "Glarch"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "{\n"
        '    "foobar": [\n'
        "        {\n"
        '            "role": "Owner",\n'
        '            "user": "luser"\n'
        "        },\n"
        "        {\n"
        '            "role": "Maintainer",\n'
        '            "user": "jsmith"\n'
        "        }\n"
        "    ],\n"
        '    "Glarch": [\n'
        "        {\n"
        '            "role": "Owner",\n'
        '            "user": "jsmith"\n'
        "        },\n"
        "        {\n"
        '            "role": "Owner",\n'
        '            "user": "froody"\n'
        "        }\n"
        "    ]\n"
        "}\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [
        mocker.call.package_roles("foobar"),
        mocker.call.package_roles("Glarch"),
    ]
def test_owned(mocker):
    spinstance = mocker.Mock(
        **{
            "user_packages.return_value": [
                ["Owner", "foobar"],
                ["Maintainer", "quux"],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["owned", "luser"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "{\n"
        '    "luser": [\n'
        "        {\n"
        '            "package": "foobar",\n'
        '            "role": "Owner"\n'
        "        },\n"
        "        {\n"
        '            "package": "quux",\n'
        '            "role": "Maintainer"\n'
        "        }\n"
        "    ]\n"
        "}\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [mocker.call.user_packages("luser")]
def test_multiple_owned(mocker):
    spinstance = mocker.Mock(
        **{
            "user_packages.side_effect": [
                [
                    ["Owner", "foobar"],
                    ["Maintainer", "quux"],
                ],
                [
                    ["Maintainer", "foobar"],
                    ["Owner", "Glarch"],
                ],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["owned", "luser", "jsmith"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "{\n"
        '    "luser": [\n'
        "        {\n"
        '            "package": "foobar",\n'
        '            "role": "Owner"\n'
        "        },\n"
        "        {\n"
        '            "package": "quux",\n'
        '            "role": "Maintainer"\n'
        "        }\n"
        "    ],\n"
        '    "jsmith": [\n'
        "        {\n"
        '            "package": "foobar",\n'
        '            "role": "Maintainer"\n'
        "        },\n"
        "        {\n"
        '            "package": "Glarch",\n'
        '            "role": "Owner"\n'
        "        }\n"
        "    ]\n"
        "}\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [
        mocker.call.user_packages("luser"),
        mocker.call.user_packages("jsmith"),
    ]
def test_search(mocker):
    spinstance = mocker.Mock(
        **{
            "search.return_value": [
                {
                    "name": "foobar",
                    "version": "1.2.3",
                    "summary": "Foo all your bars",
                    "_pypi_ordering": False,
                },
                {
                    "name": "quux",
                    "version": "0.1.0",
                    "summary": "Do that thing this does",
                    "_pypi_ordering": True,
                },
                {
                    "name": "gnusto",
                    "version": "0.0.0",
                    "summary": "",
                    "_pypi_ordering": False,
                },
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(qypi, ["search", "term", "keyword:foo", "readme:bar"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "summary": "Foo all your bars",\n'
        '        "version": "1.2.3"\n'
        "    },\n"
        "    {\n"
        '        "name": "quux",\n'
        '        "summary": "Do that thing this does",\n'
        '        "version": "0.1.0"\n'
        "    },\n"
        "    {\n"
        '        "name": "gnusto",\n'
        '        "summary": null,\n'
        '        "version": "0.0.0"\n'
        "    }\n"
        "]\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [
        mocker.call.search(
            {"description": ["term", "bar"], "keywords": ["foo"]},
            "and",
        )
    ]
def test_browse(mocker):
    spinstance = mocker.Mock(
        **{
            "browse.return_value": [
                ["foobar", "1.2.3"],
                ["foobar", "1.2.2"],
                ["foobar", "1.2.1"],
                ["foobar", "1.2.0"],
                ["quux", "0.1.0"],
                ["gnusto", "0.0.0"],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(
        qypi,
        ["browse", "Typing :: Typed", "Topic :: Utilities"],
    )
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "version": "1.2.3"\n'
        "    },\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "version": "1.2.2"\n'
        "    },\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "version": "1.2.1"\n'
        "    },\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "version": "1.2.0"\n'
        "    },\n"
        "    {\n"
        '        "name": "quux",\n'
        '        "version": "0.1.0"\n'
        "    },\n"
        "    {\n"
        '        "name": "gnusto",\n'
        '        "version": "0.0.0"\n'
        "    }\n"
        "]\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [
        mocker.call.browse(("Typing :: Typed", "Topic :: Utilities"))
    ]
def test_browse_packages(mocker):
    spinstance = mocker.Mock(
        **{
            "browse.return_value": [
                ["foobar", "1.2.3"],
                ["foobar", "1.2.2"],
                ["foobar", "1.2.1"],
                ["foobar", "1.2.0"],
                ["quux", "0.1.0"],
                ["gnusto", "0.0.0"],
            ],
        }
    )
    spclass = mocker.patch("qypi.api.ServerProxy", return_value=spinstance)
    r = CliRunner().invoke(
        qypi,
        ["browse", "--packages", "Typing :: Typed", "Topic :: Utilities"],
    )
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "name": "foobar",\n'
        '        "version": "1.2.3"\n'
        "    },\n"
        "    {\n"
        '        "name": "quux",\n'
        '        "version": "0.1.0"\n'
        "    },\n"
        "    {\n"
        '        "name": "gnusto",\n'
        '        "version": "0.0.0"\n'
        "    }\n"
        "]\n"
    )
    spclass.assert_called_once_with("https://pypi.org/pypi")
    assert spinstance.method_calls == [
        mocker.call.browse(("Typing :: Typed", "Topic :: Utilities"))
    ]
@pytest.mark.usefixtures("mock_pypi_json")
def test_info():
    r = CliRunner().invoke(qypi, ["info", "foobar"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
    )
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_explicit_latest_version():
    r = CliRunner().invoke(qypi, ["info", "foobar==1.0.0"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
    )
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_explicit_version():
    r = CliRunner().invoke(qypi, ["info", "foobar==0.2.0"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "danielstewart@frye.com",\n'
        '                "name": "Sonya Johnson",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "maynardtim@hotmail.com",\n'
        '                "name": "Stephen Romero",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Wood",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2017-02-04T12:34:05.766270Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/0.2.0",\n'
        '        "summary": "Water audience cut call.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "http://www.sanchez.net/index.htm",\n'
        '        "version": "0.2.0"\n'
        "    }\n"
        "]\n"
    )
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_description():
    r = CliRunner().invoke(qypi, ["info", "--description", "foobar"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "description": "foobar v1.0.0\\n\\nDream political close attorney sit cost inside. Seek hard can bad investment authority walk we. Sing range late use speech citizen.\\n\\nCan money issue claim onto really case. Fact garden along all book sister trip step.\\n\\nView table woman her production result. Fine allow prepare should traditional. Send cultural two care eye.\\n\\nGenerated with Faker",\n'
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
    )
@pytest.mark.usefixtures("mock_pypi_json")
def test_multiple_info():
    r = CliRunner().invoke(qypi, ["info", "has-prerel", "foobar"])
    assert r.exit_code == 0, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "has_prerel",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "freed@hotmail.com",\n'
        '                "name": "Samantha Gilbert",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "estradakelly@hotmail.com",\n'
        '                "name": "Bradley Livingston",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Coleco",\n'
        '        "project_url": "https://dummy.nil/pypi/has_prerel",\n'
        '        "release_date": "1970-04-21T22:33:29.915221Z",\n'
        '        "release_url": "https://dummy.nil/pypi/has_prerel/1.0.0",\n'
        '        "summary": "Boy kid chance indeed resource explain.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "http://www.johnson.com/author.jsp",\n'
        '        "version": "1.0.0"\n'
        "    },\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
    )
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent():
    r = CliRunner().invoke(qypi, ["info", "does-not-exist", "foobar"])
    assert r.exit_code == 1, show_result(r)
    assert r.output == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
        "qypi: does-not-exist: package not found\n"
    )


@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent_split():
    r = CliRunner(mix_stderr=False).invoke(qypi, ["info", "does-not-exist", "foobar"])
    assert r.exit_code == 1, show_result(r)
    assert r.stdout == (
        "[\n"
        "    {\n"
        '        "classifiers": [\n'
        '            "Topic :: Software Development :: Testing",\n'
        '            "UNKNOWN"\n'
        "        ],\n"
        '        "name": "foobar",\n'
        '        "people": [\n'
        "            {\n"
        '                "email": "megan30@daniels.info",\n'
        '                "name": "Brandon Perkins",\n'
        '                "role": "author"\n'
        "            },\n"
        "            {\n"
        '                "email": "cspencer@paul-fisher.com",\n'
        '                "name": "Denise Adkins",\n'
        '                "role": "maintainer"\n'
        "            }\n"
        "        ],\n"
        '        "platform": "Amiga",\n'
        '        "project_url": "https://dummy.nil/pypi/foobar",\n'
        '        "release_date": "2019-02-01T09:17:59.172284Z",\n'
        '        "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
        '        "summary": "Including drive environment my it.",\n'
        '        "unknown_field": "passed through",\n'
        '        "url": "https://www.johnson.com/homepage.php",\n'
        '        "version": "1.0.0"\n'
        "    }\n"
        "]\n"
    )
    assert r.stderr == "qypi: does-not-exist: package not found\n"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent_version():
r = CliRunner().invoke(qypi, ["info", "foobar==2.23.42"])
assert r.exit_code == 1, show_result(r)
    assert r.output == "[]\nqypi: foobar: version 2.23.42 not found\n"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent_version_split():
r = CliRunner(mix_stderr=False).invoke(qypi, ["info", "foobar==2.23.42"])
assert r.exit_code == 1, show_result(r)
assert r.stdout == "[]\n"
assert r.stderr == "qypi: foobar: version 2.23.42 not found\n"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent_explicit_version():
r = CliRunner().invoke(qypi, ["info", "does-not-exist==2.23.42"])
assert r.exit_code == 1, show_result(r)
    assert r.output == "[]\nqypi: does-not-exist: version 2.23.42 not found\n"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nonexistent_explicit_version_split():
r = CliRunner(mix_stderr=False).invoke(qypi, ["info", "does-not-exist==2.23.42"])
assert r.exit_code == 1, show_result(r)
assert r.stdout == "[]\n"
assert r.stderr == "qypi: does-not-exist: version 2.23.42 not found\n"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_latest_is_prerelease():
r = CliRunner().invoke(qypi, ["info", "has-prerel"])
assert r.exit_code == 0, show_result(r)
data = json.loads(r.output)
assert data[0]["version"] == "1.0.0"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_latest_is_prerelease_pre():
r = CliRunner().invoke(qypi, ["info", "--pre", "has-prerel"])
assert r.exit_code == 0, show_result(r)
data = json.loads(r.output)
assert data[0]["version"] == "1.0.1a1"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_explicit_prerelease():
r = CliRunner().invoke(qypi, ["info", "has-prerel==1.0.1a1"])
assert r.exit_code == 0, show_result(r)
data = json.loads(r.output)
assert data[0]["version"] == "1.0.1a1"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_all_are_prerelease():
r = CliRunner().invoke(qypi, ["info", "prerelease-only"])
assert r.exit_code == 0, show_result(r)
data = json.loads(r.output)
assert data[0]["version"] == "0.2a1"
@pytest.mark.usefixtures("mock_pypi_json")
def test_info_nullfields():
r = CliRunner().invoke(qypi, ["info", "nullfields"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"[\n"
" {\n"
' "classifiers": [\n'
' "Topic :: Software Development :: Testing",\n'
' "UNKNOWN"\n'
" ],\n"
' "name": "nullfields",\n'
' "people": [\n'
" {\n"
' "email": "barbara10@yahoo.com",\n'
' "name": "Philip Gonzalez",\n'
' "role": "author"\n'
" }\n"
" ],\n"
' "platform": null,\n'
' "project_url": "https://dummy.nil/pypi/nullfields",\n'
' "release_date": "2007-10-08T07:21:06.191703Z",\n'
' "release_url": "https://dummy.nil/pypi/nullfields/1.0.0",\n'
' "summary": "Film station choose short.",\n'
' "unknown_field": null,\n'
' "url": "https://bryant.com/wp-content/search/author/",\n'
' "version": "1.0.0"\n'
" }\n"
"]\n"
)
@pytest.mark.usefixtures("mock_pypi_json")
def test_readme():
r = CliRunner().invoke(qypi, ["readme", "foobar"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"foobar v1.0.0\n"
"\n"
"Dream political close attorney sit cost inside. Seek hard can bad investment authority walk we. Sing range late use speech citizen.\n"
"\n"
"Can money issue claim onto really case. Fact garden along all book sister trip step.\n"
"\n"
"View table woman her production result. Fine allow prepare should traditional. Send cultural two care eye.\n"
"\n"
"Generated with Faker\n"
)
@pytest.mark.usefixtures("mock_pypi_json")
def test_readme_explicit_version():
r = CliRunner().invoke(qypi, ["readme", "foobar==0.2.0"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"foobar v0.2.0\n"
"\n"
"Lead must laugh trouble expert else get million.\n"
"\n"
"Top shake walk. A cold national.\n"
"\n"
"Bring energy yourself suffer. Catch concern official relate voice base.\n"
"\n"
"Generated with Faker\n"
)
@pytest.mark.usefixtures("mock_pypi_json")
def test_files():
r = CliRunner().invoke(qypi, ["files", "foobar"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"[\n"
" {\n"
' "files": [\n'
" {\n"
' "comment_text": "",\n'
' "digests": {\n'
' "md5": "f92e8964922878760a07f783341a58ae",\n'
' "sha256": "84750bd98e3f61441e4b86ab443ebae41e65557e2b071b5a8e22a7d61a48a59d"\n'
" },\n"
' "filename": "foobar-1.0.0-py2.py3-none-any.whl",\n'
' "has_sig": true,\n'
' "md5_digest": "f92e8964922878760a07f783341a58ae",\n'
' "packagetype": "bdist_wheel",\n'
' "python_version": "py2.py3",\n'
' "size": 735,\n'
' "unknown_field": "passed through",\n'
' "upload_time": "2019-02-01T09:17:59",\n'
' "upload_time_iso_8601": "2019-02-01T09:17:59.172284Z",\n'
' "url": "https://files.dummyhosted.nil/packages/7f/97/e5ec19aed5d108c2f6c2fc6646d8247b1fadb49f0bf48e87a0fca8827696/foobar-1.0.0-py2.py3-none-any.whl"\n'
" }\n"
" ],\n"
' "name": "foobar",\n'
' "version": "1.0.0"\n'
" }\n"
"]\n"
)
@pytest.mark.usefixtures("mock_pypi_json")
def test_files_explicit_version():
r = CliRunner().invoke(qypi, ["files", "foobar==0.2.0"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"[\n"
" {\n"
' "files": [\n'
" {\n"
' "comment_text": "",\n'
' "digests": {\n'
' "md5": "5ced02e62434eb5649276e6f12003009",\n'
' "sha256": "f0862078b4f1af49f6b8c91153e9a7df88807900f9cf1b24287a901e515c824e"\n'
" },\n"
' "filename": "foobar-0.2.0-py2.py3-none-any.whl",\n'
' "has_sig": false,\n'
' "md5_digest": "5ced02e62434eb5649276e6f12003009",\n'
' "packagetype": "bdist_wheel",\n'
' "python_version": "py2.py3",\n'
' "size": 752,\n'
' "unknown_field": "passed through",\n'
' "upload_time": "2017-02-04T12:34:05",\n'
' "upload_time_iso_8601": "2017-02-04T12:34:05.766270Z",\n'
' "url": "https://files.dummyhosted.nil/packages/54/40/36eccb727704b5dabfda040e0eb23c29dbe26cf1a78cbeb24f33deb26b22/foobar-0.2.0-py2.py3-none-any.whl"\n'
" }\n"
" ],\n"
' "name": "foobar",\n'
' "version": "0.2.0"\n'
" }\n"
"]\n"
)
@pytest.mark.usefixtures("mock_pypi_json")
def test_releases():
r = CliRunner().invoke(qypi, ["releases", "foobar"])
assert r.exit_code == 0, show_result(r)
assert r.output == (
"{\n"
' "foobar": [\n'
" {\n"
' "is_prerelease": false,\n'
' "release_date": "2013-01-18T18:53:56.265173Z",\n'
' "release_url": "https://dummy.nil/pypi/foobar/0.1.0",\n'
' "version": "0.1.0"\n'
" },\n"
" {\n"
' "is_prerelease": false,\n'
' "release_date": "2017-02-04T12:34:05.766270Z",\n'
' "release_url": "https://dummy.nil/pypi/foobar/0.2.0",\n'
' "version": "0.2.0"\n'
" },\n"
" {\n"
' "is_prerelease": false,\n'
' "release_date": "2019-02-01T09:17:59.172284Z",\n'
' "release_url": "https://dummy.nil/pypi/foobar/1.0.0",\n'
' "version": "1.0.0"\n'
" }\n"
" ]\n"
"}\n"
)
# `qypi --index-url`
# ---------------------------------------------------------------------------
# File: toolkit/simulator_lib/simgc.py
# Repo: BxNxM/MicrOS (MIT), commit e555253de795260b4141f47cb3358e8c690c8c96
# ---------------------------------------------------------------------------
def mem_free(*args, **kwargs):
    """Simulate MicroPython's gc.mem_free(): report a fixed amount of free heap."""
    return 10000


def collect(*args, **kwargs):
    """Simulate MicroPython's gc.collect(): garbage collection is a no-op on the host."""
    pass
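These two stubs mirror MicroPython's `gc` API on desktop Python. A hedged sketch of how a host-side simulator could publish such a stub through `sys.modules` so simulated firmware code can import it (the module name `gc_sim` and this wiring are assumptions for illustration, not necessarily how the MicrOS toolkit registers its stubs):

```python
import sys
import types

# Build a MicroPython-style gc substitute with the same two entry points.
gc_sim = types.ModuleType("gc_sim")
gc_sim.mem_free = lambda *args, **kwargs: 10000  # fixed free-heap figure
gc_sim.collect = lambda *args, **kwargs: None    # collection is a no-op on the host

# Register it so any later `import gc_sim` resolves to this stub.
sys.modules["gc_sim"] = gc_sim

import gc_sim as gc_module
print(gc_module.mem_free())
```

Registering the stub before the simulated code runs keeps the firmware sources unchanged; only the import machinery is redirected.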
# ---------------------------------------------------------------------------
# File: pysnmp-with-texts/Juniper-RADIUS-CLIENT-MIB.py
# Repo: agustinhenze/mibs.snmplabs.com (Apache-2.0), commit 1fc5c07860542b89212f4c8ab807057d9a9206c7
# ---------------------------------------------------------------------------
#
# PySNMP MIB module Juniper-RADIUS-CLIENT-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/Juniper-RADIUS-CLIENT-MIB
# Produced by pysmi-0.3.4 at Wed May 1 14:04:03 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueSizeConstraint, ValueRangeConstraint, ConstraintsIntersection, ConstraintsUnion = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueSizeConstraint", "ValueRangeConstraint", "ConstraintsIntersection", "ConstraintsUnion")
juniMibs, = mibBuilder.importSymbols("Juniper-MIBs", "juniMibs")
ObjectGroup, NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "ObjectGroup", "NotificationGroup", "ModuleCompliance")
Bits, IpAddress, ModuleIdentity, NotificationType, Unsigned32, iso, Gauge32, Counter32, ObjectIdentity, Counter64, Integer32, MibIdentifier, TimeTicks, MibScalar, MibTable, MibTableRow, MibTableColumn = mibBuilder.importSymbols("SNMPv2-SMI", "Bits", "IpAddress", "ModuleIdentity", "NotificationType", "Unsigned32", "iso", "Gauge32", "Counter32", "ObjectIdentity", "Counter64", "Integer32", "MibIdentifier", "TimeTicks", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn")
TextualConvention, TruthValue, DisplayString, RowStatus = mibBuilder.importSymbols("SNMPv2-TC", "TextualConvention", "TruthValue", "DisplayString", "RowStatus")
juniRadiusClientMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19))
juniRadiusClientMIB.setRevisions(('2009-02-26 16:41', '2008-06-18 10:10', '2008-06-11 06:15', '2007-12-14 15:00', '2007-09-18 18:22', '2007-09-16 22:00', '2007-04-10 01:03', '2006-02-17 22:00', '2006-01-12 22:00', '2005-09-30 14:55', '2005-01-14 15:15', '2004-12-06 02:32', '2004-12-03 22:12', '2004-09-09 19:45', '2003-12-15 16:36', '2003-03-10 19:33', '2003-01-27 18:33', '2002-11-21 19:45', '2002-05-13 17:54', '2001-10-16 19:54', '2001-09-06 21:08', '2001-03-22 15:20', '2000-12-19 16:40', '2000-05-05 19:44', '1999-06-01 00:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: juniRadiusClientMIB.setRevisionsDescriptions(('Modified the valid ranges of juniRadiusAuthClientCfgTimeoutInterval, juniRadiusAuthClientCfgRetries, juniRadiusAuthClientCfgDeadTime, juniRadiusAcctClientCfgTimeoutInterval, juniRadiusAcctClientCfgRetries, juniRadiusAcctClientCfgDeadTime. Modified the default values of juniRadiusAuthClientCfgDeadTime and juniRadiusAcctClientCfgDeadTime from 5 to 0. Added juniRadiusClientIncludeIpv6AccountingInAcctStop. Added juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart, juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop, juniRadiusClientIncludeFramedIpv6PoolInAcctStart, juniRadiusClientIncludeFramedIpv6PoolInAcctStop, juniRadiusClientIncludeFramedIpv6RouteInAcctStart, juniRadiusClientIncludeFramedIpv6RouteInAcctStop, juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart, juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop, juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart, juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop, juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart, juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop, juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart, juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop, juniRadiusClientIncludeIpv6VirtualRouterInAcctStart, juniRadiusClientIncludeIpv6VirtualRouterInAcctStop.', 'Added juniRadiusClientIgnorePppoeMaxSession', 'Modified juniRadiusClientCallingStationIdFormat of juniRadiusGeneralClient to include the SVLAN ID', 'Added juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq, juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq, juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart, juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart, juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop, juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop.', 'Added juniRadiusClientIncludeInterfaceIdInAcctStart, juniRadiusClientIncludeIpv6PrefixInAcctStart, juniRadiusClientIncludeInterfaceIdInAcctStop, 
juniRadiusClientIncludeIpAddrInAcctStop, juniRadiusClientIncludeIpv6PrefixInAcctStop.', 'Extended the valid range of juniRadiusAcctClientCfgMaxPendingRequests from 32000 to 96000.', 'Added juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq, juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq, juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq, juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq, juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq, juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq, juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq, juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq, juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq, juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq, juniRadiusClientIncludeL2cDslLineStateInAccessReq, juniRadiusClientIncludeL2cDslTypeInAccessReq, juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart, juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart, juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart, juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart, juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart, juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart, juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart, juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart, juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart, 
juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart, juniRadiusClientIncludeL2cDslLineStateInAcctStart, juniRadiusClientIncludeL2cDslTypeInAcctStart, juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop, juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop, juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop, juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop, juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop, juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop, juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop, juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop, juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop, juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop, juniRadiusClientIncludeL2cDslLineStateInAcctStop, juniRadiusClientIncludeL2cDslTypeInAcctStop allowing to control generation and format of decoded L2C Attributes.', 'Added new objects BRAS group to allow inclusion of DSL Forum attributes into radius requests.', 'Added new objects BRAS group to allow inclusion of L2C information, L2C up and down stream data into radius requests.', 'Added new value to remote circuit id format types.', 'Added new objects to the BRAS group to allow the widths of the fields in the 
Nas-Port attribute (attribute number 5) to be configurable for atm and ethernet interfaces. Added new objects to control PPPoE Remote Circuit Id representation.', 'Added new objects BRAS group to allow inclusion of interface description into radius requests.', 'Added a new object to the BRAS group to allow override of nas-ip-address and nas-identifier from authentication router. Added new objects to the BRAS group to allow override of nas-port-id and calling-station-id with PPPoE Remote Circuit Id. Added new objects to the BRAS group to indicate which RADIUS attributes should be included or excluded from RADIUS packets. Added support for inclusion/exclusion of DHCP attributes.', 'Added new objects to the BRAS group to indicate which RADIUS attributes should be included or excluded from RADIUS packets (acct-multi-session-id, ascendNumInMultilink, profileServiceDescription, acctAuthentic, acctDelayTime, acctSessionId, nasIdentifier, eventTimestamp, mlpppBundleName and terminateCause). Added support to format nas-port, and connect-info attributes.', 'Added new objects: juniRadiusAcctClientRejectRequests, juniRadiusAcctClientRejectResponses, juniRadiusClientVlanNasPortFormat.', 'Added new objects: juniRadiusClientPppoeNasPortFormat, juniRadiusClientIncludeTunnelInterfaceIdInAccessReq, juniRadiusClientIncludeTunnelInterfaceIdInAcctStart, juniRadiusClientIncludeTunnelInterfaceIdInAcctStop.', 'Replaced Unisphere names with Juniper names. Added objects to ignore attributes from the access-accept RADIUS packets. Added objects for RADIUS trap enable/disable control and detailed accounting statistics. 
Added notifications for available RADIUS servers.', 'Added notifications for unavailable RADIUS servers.', 'Added objects (parameters) to indicate which RADIUS attributes should be included/excluded from RADIUS packets.', 'Added juniRadiusClientNasIpAddrUse.', 'Added juniRadiusClientRollover and juniRadiusClientCallingStationIdFormat.', 'Added juniRadiusClientEthernetPortType, juniRadiusClientIncludeIpAddrInAcctStart, and juniRadiusClientIncludeAcctSessionIdInAccessReq.', 'Added support for the RADIUS accounting backoff mechanism.', 'Added support for client source address.', 'Initial version of this MIB module, derived from IETF Internet Drafts of RADIUS Client MIBs for Authentication and Accounting.',))
if mibBuilder.loadTexts: juniRadiusClientMIB.setLastUpdated('200902261641Z')
if mibBuilder.loadTexts: juniRadiusClientMIB.setOrganization('Juniper Networks, Inc.')
if mibBuilder.loadTexts: juniRadiusClientMIB.setContactInfo(' Juniper Networks, Inc. Postal: 10 Technology Park Drive Westford, MA 01886-3146 USA Tel: +1 978 589 5800 Email: mib@Juniper.net')
if mibBuilder.loadTexts: juniRadiusClientMIB.setDescription('The Remote Authentication Dial In User Service (RADIUS) Client MIB for the Juniper enterprise.')
class JuniRadiusClientRemoterCircuitIdFormatComponents(TextualConvention, Integer32):
    description = "The set of configurable choices of PPPoE Remote Circuit Id. The maximum enumerated type will never be greater than 255; 'agentCircuitId' denotes suboption 1 and 'agentRemoteId' denotes suboption 2 of option 82 (RFC 3046)."
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4))
namedValues = NamedValues(("agentCircuitId", 1), ("agentRemoteId", 2), ("nasIdentifier", 3), ("dsl-format-1", 4))
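The textual convention above restricts the value to four labelled integers. A dependency-free sketch of the equivalent lookup (plain Python, pysnmp not required; the dict and helper names are illustrative):

```python
# Plain-Python equivalent of the SingleValueConstraint/NamedValues pair above.
REMOTE_CIRCUIT_ID_FORMATS = {
    1: "agentCircuitId",  # DHCP option 82, suboption 1 (RFC 3046)
    2: "agentRemoteId",   # DHCP option 82, suboption 2
    3: "nasIdentifier",
    4: "dsl-format-1",
}

def remote_circuit_id_format(value):
    # Mirror the constraint: anything outside {1, 2, 3, 4} is rejected.
    if value not in REMOTE_CIRCUIT_ID_FORMATS:
        raise ValueError(f"{value} violates SingleValueConstraint(1, 2, 3, 4)")
    return REMOTE_CIRCUIT_ID_FORMATS[value]

print(remote_circuit_id_format(2))
```

pysnmp performs the same int-to-label resolution at runtime when it decodes or renders the attribute.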
juniRadiusClientObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1))
juniRadiusGeneralClient = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1))
juniRadiusAuthClient = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2))
juniRadiusAcctClient = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3))
juniRadiusClientIdentifier = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 1), DisplayString()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusClientIdentifier.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIdentifier.setDescription('The NAS-Identifier of the RADIUS client.')
juniRadiusClientAlgorithm = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("direct", 0), ("roundRobin", 1))).clone('direct')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientAlgorithm.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientAlgorithm.setDescription('The algorithm used by the client when multiple authentication/accounting servers are configured: direct Use servers in order of precedence, each time beginning with the highest precedence server and proceeding to lower precedence servers if the RADIUS request fails, until the request succeeds or all servers have been tried. roundRobin Use servers in round-robin order, each time beginning with the next round-robin-ordered server and proceeding cyclically through servers if the RADIUS request fails, until the request succeeds or all servers have been tried.')
juniRadiusClientSourceAddress = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 3), IpAddress()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientSourceAddress.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientSourceAddress.setDescription('The source address used by the RADIUS client in requests to the RADIUS server. The RADIUS server returns responses from this address. Setting this object to 0.0.0.0 will reset the value to its default.')
juniRadiusClientUdpChecksum = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 4), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientUdpChecksum.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientUdpChecksum.setDescription('Enables/disables the checksum calculations on RADIUS UDP packets.')
juniRadiusClientNasIdentifier = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 5), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 64))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasIdentifier.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasIdentifier.setDescription('The identifier used by the client for the value of NAS-Identifier attribute (number 32) in access and accounting requests. The default is to use the system name.')
juniRadiusClientDslPortType = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 6), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(5, 11, 12, 13, 14, 16))).clone(namedValues=NamedValues(("virtual", 5), ("sdsl", 11), ("adsl-cap", 12), ("adsl-dmt", 13), ("idsl", 14), ("xdsl", 16))).clone('xdsl')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientDslPortType.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientDslPortType.setDescription('The value to use in the NAS-Port-Type RADIUS Attribute (attribute number 61) for DSL interfaces in the RADIUS access and accounting messages: virtual Used for Virtual interfaces. sdsl Used for Symmetric DSL. adsl-cap Used for Asymmetric DSL, Carrierless Amplitude Phase Modulation. adsl-dmt Used for Asymmetric DSL, Discrete Multi-Tone. idsl Used for ISDN Digital Subscriber Line. xdsl Used for Digital Subscriber Line of unknown type.')
juniRadiusClientTunnelAccounting = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 7), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTunnelAccounting.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTunnelAccounting.setDescription('Enables/disables the tunnel accounting feature, which causes the system to send tunnel and session accounting requests.')
juniRadiusClientAcctSessionIdFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("decimal", 0), ("description", 1))).clone('description')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientAcctSessionIdFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientAcctSessionIdFormat.setDescription('The format used by the client for the Acct-Session-ID attribute (attribute number 44): decimal Use an ASCII decimal value only in the Acct-Session-ID attribute. description Use an ASCII description value which includes the interface type (e.g., ATM), slot, port, and circuit number (VPI and VCI for ATM), and a hexadecimal value in the Acct-Session-ID attribute.')
juniRadiusClientNasPortFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 9), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("xssssppp", 0), ("ssssxppp", 1))).clone('ssssxppp')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFormat.setDescription('The format used by the client for the NAS-Port attribute (attribute number 5): xssssppp In the NAS-Port attribute (attribute 5) use format of 0 bit, followed by 4 bits for the slot number, 3 bits for the port number, and finally, the circuit number in the remaining bits. ssssxppp In the NAS-Port attribute (attribute 5) use format of 4 bits for the slot number, followed by a 0 bit, 3 bits for the port number and finally, the circuit number in the remaining bits.')
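The two layouts named in the description above pack slot, port, and circuit into the 32-bit NAS-Port integer. A sketch of both encodings (helper names are illustrative, and the 24-bit circuit width is inferred from "the remaining bits" after 1 + 4 + 3 header bits):

```python
def nas_port_xssssppp(slot, port, circuit):
    # 0 bit | 4-bit slot | 3-bit port | circuit in the remaining 24 bits
    return ((slot & 0xF) << 27) | ((port & 0x7) << 24) | (circuit & 0xFFFFFF)

def nas_port_ssssxppp(slot, port, circuit):
    # 4-bit slot | 0 bit | 3-bit port | circuit in the remaining 24 bits
    return ((slot & 0xF) << 28) | ((port & 0x7) << 24) | (circuit & 0xFFFFFF)

print(hex(nas_port_xssssppp(3, 2, 100)))
print(hex(nas_port_ssssxppp(3, 2, 100)))
```

Masking each field before shifting keeps an out-of-range slot or port from corrupting neighbouring fields, which is the practical difference a RADIUS server sees between the two layouts: only the position of the slot nibble moves.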
juniRadiusClientCallingStationDelimiter = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 10), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1).clone('#')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientCallingStationDelimiter.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientCallingStationDelimiter.setDescription("The character used to delimit fields in the Calling-Station-ID attribute (attribute 31, from RFC 2865) sent by the client. The default value is '#'.")
juniRadiusClientEthernetPortType = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 11), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(5, 15))).clone(namedValues=NamedValues(("virtual", 5), ("ethernet", 15))).clone('ethernet')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientEthernetPortType.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientEthernetPortType.setDescription('The value to use in the NAS-Port-Type RADIUS Attribute (attribute number 61) for Ethernet interfaces in the RADIUS access and accounting messages: ethernet Used for Ethernet interfaces. virtual Used for Virtual interfaces.')
juniRadiusClientIncludeIpAddrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 12), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpAddrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpAddrInAcctStart.setDescription('Enables/disables the inclusion of the Framed-IP-Address attribute in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeAcctSessionIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 13), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAccessReq.setDescription('Enables/disables the inclusion of the Acct-Session-ID attribute in the RADIUS Access-Request packet.')
juniRadiusClientRollover = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 14), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientRollover.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientRollover.setDescription('Enables/disables the rollover to next server on receipt of access-reject.')
juniRadiusClientCallingStationIdFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 15), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4, 5, 6))).clone(namedValues=NamedValues(("delimited", 0), ("fixedFormat", 1), ("fixedFormatAdapterEmbedded", 2), ("fixedFormatAdapterNewField", 3), ("fixedFormatStacked", 4), ("fixedFormatAdapterEmbeddedStacked", 5), ("fixedFormatAdapterNewFieldStacked", 6))).clone('delimited')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientCallingStationIdFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientCallingStationIdFormat.setDescription("The format used by the client for the Calling-Station-ID attribute (attribute number 31): delimited In the Calling-Station-ID attribute (attribute 31) use the format '<delimiter><BAS name><delimiter> <interface description><delimiter><layer 2 identifier>'. fixedFormat In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 2 digits of slot, 1 digit of port, 3 digits of VPI, followed by 5 digits of VCI. fixedFormatAdapterEmbedded In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 1 digit of slot, 1 digit of adapter, 1 digit of port, 3 digits of VPI, followed by 5 digits of VCI. fixedFormatAdapterNewField In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 2 digits of slot, 1 digit of adapter, 2 digits of port, 3 digits of VPI, followed by 5 digits of VCI. fixedFormatStacked In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 2 digits of slot, 1 digit of port, 4 digits of SVLAN ID and 4 digits of VLAN ID only in the case of Ethernet. fixedFormatAdapterEmbeddedStacked In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 1 digit of slot, 1 digit of adapter, 1 digit of port, 4 digits of SVLAN ID and 4 digits of VLAN ID only in the case of Ethernet. fixedFormatAdapterNewFieldStacked In the Calling-Station-ID attribute (attribute 31) use the format of 4 bytes for the host name (truncated if needed), 2 digits of slot, 1 digit of adapter, 2 digits of port, 4 digits of SVLAN ID and 4 digits of VLAN ID only in the case of Ethernet.")
juniRadiusClientNasIpAddrUse = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 16), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("normal", 0), ("tunnelClientEndpoint", 1))).clone('normal')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasIpAddrUse.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasIpAddrUse.setDescription("The value used by the client for the NAS-IP-Addr attribute (attribute number 4): normal Use the ERX IP address value in the NAS-IP-Addr attribute (attribute 4). tunnelClientEndpoint Use the Tunnel Client's address value in the NAS-IP-Addr attribute (attribute 4) for tunnel users.")
juniRadiusClientIncludeAcctTunnelConnectionInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 17), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAccessReq.setDescription('Enables/disables the inclusion of the Acct-Tunnel-Connection attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeCalledStationIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 18), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAccessReq.setDescription('Enables/disables the inclusion of the Called-Station-ID attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeCallingStationIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 19), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAccessReq.setDescription('Enables/disables the inclusion of the Calling-Station-ID attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeConnectInfoInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 20), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAccessReq.setDescription('Enables/disables the inclusion of the Connect-Info attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeNasIdentifierInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 21), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAccessReq.setDescription('Enables/disables the inclusion of the NAS-Identifier attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeNasPortInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 22), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAccessReq.setDescription('Enables/disables the inclusion of the NAS-Port attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeNasPortIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 23), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAccessReq.setDescription('Enables/disables the inclusion of the NAS-Port-ID attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeNasPortTypeInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 24), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAccessReq.setDescription('Enables/disables the inclusion of the NAS-Port-Type attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludePppoeDescriptionInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 25), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAccessReq.setDescription('Enables/disables the inclusion of the PPPoE-Description (VSA) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelClientAuthIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 26), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Client-Auth-Id attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelClientEndpointInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 27), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Client-Endpoint attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelMediumTypeInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 28), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Medium-Type attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelServerAttributesInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 29), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Server attributes (Tunnel attributes for a PPP session terminated on the LNS) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelServerAuthIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 30), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Server-Auth-Id attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelServerEndpointInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 31), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Server-Endpoint attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelTypeInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 32), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Type attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeAcctTunnelConnectionInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 33), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAcctStart.setDescription('Enables/disables the inclusion of the Acct-Tunnel-Connection attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeCalledStationIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 34), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAcctStart.setDescription('Enables/disables the inclusion of the Called-Station-ID attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeCallingStationIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 35), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAcctStart.setDescription('Enables/disables the inclusion of the Calling-Station-ID attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeClassInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 36), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeClassInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeClassInAcctStart.setDescription('Enables/disables the inclusion of the Class attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeConnectInfoInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 37), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAcctStart.setDescription('Enables/disables the inclusion of the Connect-Info attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeEgressPolicyNameInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 38), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEgressPolicyNameInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEgressPolicyNameInAcctStart.setDescription('Enables/disables the inclusion of the Egress-Policy-Name (VSA) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeEventTimestampInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 39), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctStart.setDescription('Enables/disables the inclusion of the Event-Timestamp attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeFramedCompressionInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 40), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedCompressionInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedCompressionInAcctStart.setDescription('Enables/disables the inclusion of the Framed-Compression attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeFramedIpNetmaskInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 41), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpNetmaskInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpNetmaskInAcctStart.setDescription('Enables/disables the inclusion of the Framed-IP-Netmask attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeIngressPolicyNameInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 42), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIngressPolicyNameInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIngressPolicyNameInAcctStart.setDescription('Enables/disables the inclusion of the Ingress-Policy-Name (VSA) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeNasIdentifierInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 43), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctStart.setDescription('Enables/disables the inclusion of the NAS-Identifier attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeNasPortInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 44), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAcctStart.setDescription('Enables/disables the inclusion of the NAS-Port attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeNasPortIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 45), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAcctStart.setDescription('Enables/disables the inclusion of the NAS-Port-ID attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeNasPortTypeInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 46), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAcctStart.setDescription('Enables/disables the inclusion of the NAS-Port-Type attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludePppoeDescriptionInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 47), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAcctStart.setDescription('Enables/disables the inclusion of the PPPoE-Description (VSA) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelAssignmentIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 48), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelAssignmentIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelAssignmentIdInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Assignment-Id attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelClientAuthIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 49), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Client-Auth-Id attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelClientEndpointInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 50), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Client-Endpoint attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelMediumTypeInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 51), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Medium-Type attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelPreferenceInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 52), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelPreferenceInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelPreferenceInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Preference attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelServerAttributesInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 53), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Server attributes (Tunnel attributes for a PPP session terminated on the LNS) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelServerAuthIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 54), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Server-Auth-Id attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelServerEndpointInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 55), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Server-Endpoint attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelTypeInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 56), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Type attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeAcctTunnelConnectionInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 57), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTunnelConnectionInAcctStop.setDescription('Enables/disables the inclusion of the Acct-Tunnel-Connection attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeCalledStationIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 59), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCalledStationIdInAcctStop.setDescription('Enables/disables the inclusion of the Called-Station-ID attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeCallingStationIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 60), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeCallingStationIdInAcctStop.setDescription('Enables/disables the inclusion of the Calling-Station-ID attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeClassInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 61), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeClassInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeClassInAcctStop.setDescription('Enables/disables the inclusion of the Class attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeConnectInfoInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 62), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeConnectInfoInAcctStop.setDescription('Enables/disables the inclusion of the Connect-Info attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeEgressPolicyNameInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 63), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEgressPolicyNameInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEgressPolicyNameInAcctStop.setDescription('Enables/disables the inclusion of the Egress-Policy-Name (VSA) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeEventTimestampInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 64), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctStop.setDescription('Enables/disables the inclusion of the Event-Timestamp attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeFramedCompressionInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 65), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedCompressionInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedCompressionInAcctStop.setDescription('Enables/disables the inclusion of the Framed-Compression attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeFramedIpNetmaskInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 66), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpNetmaskInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpNetmaskInAcctStop.setDescription('Enables/disables the inclusion of the Framed-IP-Netmask attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeIngressPolicyNameInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 67), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIngressPolicyNameInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIngressPolicyNameInAcctStop.setDescription('Enables/disables the inclusion of the Ingress-Policy-Name (VSA) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeInputGigawordsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 68), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInputGigawordsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInputGigawordsInAcctStop.setDescription('Enables/disables the inclusion of the Input-Gigawords attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeNasIdentifierInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 69), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctStop.setDescription('Enables/disables the inclusion of the NAS-Identifier attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeNasPortInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 70), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortInAcctStop.setDescription('Enables/disables the inclusion of the NAS-Port attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeNasPortIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 71), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortIdInAcctStop.setDescription('Enables/disables the inclusion of the NAS-Port-ID attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeNasPortTypeInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 72), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasPortTypeInAcctStop.setDescription('Enables/disables the inclusion of the NAS-Port-Type attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeOutputGigawordsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 73), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeOutputGigawordsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeOutputGigawordsInAcctStop.setDescription('Enables/disables the inclusion of the Output-Gigawords attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludePppoeDescriptionInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 74), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludePppoeDescriptionInAcctStop.setDescription('Enables/disables the inclusion of the PPPoE-Description (VSA) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelAssignmentIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 75), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelAssignmentIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelAssignmentIdInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Assignment-Id attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelClientAuthIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 76), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientAuthIdInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Client-Auth-Id attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelClientEndpointInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 77), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelClientEndpointInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Client-Endpoint attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelMediumTypeInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 78), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelMediumTypeInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Medium-Type attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelPreferenceInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 79), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelPreferenceInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelPreferenceInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Preference attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelServerAttributesInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 80), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAttributesInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Server attributes (Tunnel attributes for a PPP session terminated on the LNS) in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelServerAuthIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 81), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerAuthIdInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Server-Auth-Id attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelServerEndpointInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 82), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelServerEndpointInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Server-Endpoint attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeTunnelTypeInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 83), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelTypeInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Type attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeInputGigapktsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 84), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInputGigapktsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInputGigapktsInAcctStop.setDescription('Enables/disables the inclusion of the Input-Gigapkts attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeOutputGigapktsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 85), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeOutputGigapktsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeOutputGigapktsInAcctStop.setDescription('Enables/disables the inclusion of the Output-Gigapkts attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIgnoreFramedIpNetmask = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 86), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreFramedIpNetmask.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreFramedIpNetmask.setDescription('Enables/disables ignoring the Framed-IP-Netmask attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreAtmCategory = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 87), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmCategory.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmCategory.setDescription('Enables/disables ignoring the ATM-Category (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreAtmMbs = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 88), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmMbs.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmMbs.setDescription('Enables/disables ignoring the ATM-MBS (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreAtmPcr = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 89), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmPcr.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmPcr.setDescription('Enables/disables ignoring the ATM-PCR (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreAtmScr = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 90), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmScr.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreAtmScr.setDescription('Enables/disables ignoring the ATM-SCR-Or-CBR (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreEgressPolicyName = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 91), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreEgressPolicyName.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreEgressPolicyName.setDescription('Enables/disables ignoring the Egress-Policy-Name (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreIngressPolicyName = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 92), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreIngressPolicyName.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreIngressPolicyName.setDescription('Enables/disables ignoring the Ingress-Policy-Name (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIgnoreVirtualRouter = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 93), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnoreVirtualRouter.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnoreVirtualRouter.setDescription('Enables/disables ignoring the Virtual-Router (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientTrapOnAuthServerUnavailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 94), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnAuthServerUnavailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnAuthServerUnavailable.setDescription('Enables/disables sending an SNMP trap for the condition that a specific RADIUS authentication server times out.')
juniRadiusClientTrapOnAcctServerUnavailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 95), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnAcctServerUnavailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnAcctServerUnavailable.setDescription('Enables/disables sending an SNMP trap for the condition that a specific RADIUS accounting server times out.')
juniRadiusClientTrapOnNoAuthServerAvailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 96), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnNoAuthServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnNoAuthServerAvailable.setDescription('Enables/disables sending an SNMP trap for the condition that all of the configured RADIUS authentication servers (in a virtual router context) time out.')
juniRadiusClientTrapOnNoAcctServerAvailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 97), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnNoAcctServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnNoAcctServerAvailable.setDescription('Enables/disables sending an SNMP trap for the condition that all of the configured RADIUS accounting servers (in a virtual router context) time out.')
juniRadiusClientTrapOnAuthServerAvailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 98), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnAuthServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnAuthServerAvailable.setDescription('Enables/disables sending an SNMP trap for the condition that a specific RADIUS authentication server has sent a response after being declared unavailable.')
juniRadiusClientTrapOnAcctServerAvailable = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 99), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientTrapOnAcctServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientTrapOnAcctServerAvailable.setDescription('Enables/disables sending an SNMP trap for the condition that a specific RADIUS accounting server has sent a response after being declared unavailable.')
juniRadiusClientPppoeNasPortFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 100), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("none", 0), ("unique", 1))).clone('none')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientPppoeNasPortFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientPppoeNasPortFormat.setDescription('The format used by the client for the Nas-Port attribute (attribute number 5) for PPPoE interfaces: none - use the format specified in juniRadiusClientNasPortFormat; unique - use a unique value that is not related to the interface.')
juniRadiusClientIncludeTunnelInterfaceIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 101), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAccessReq.setDescription('Enables/disables the inclusion of the Tunnel-Interface-Id (VSA) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeTunnelInterfaceIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 102), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAcctStart.setDescription('Enables/disables the inclusion of the Tunnel-Interface-Id (VSA) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeTunnelInterfaceIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 103), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeTunnelInterfaceIdInAcctStop.setDescription('Enables/disables the inclusion of the Tunnel-Interface-Id (VSA) in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 104), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop.setDescription('Enables/disables the inclusion of the L2TP PPP Disconnect Cause (VSA) in the RADIUS Accounting-Stop packet.')
juniRadiusClientVlanNasPortFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 105), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("none", 0), ("stacked", 1))).clone('none')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientVlanNasPortFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientVlanNasPortFormat.setDescription('The format used by the client for the Nas-Port attribute (attribute number 5) for VLAN interfaces: none - include the VLAN ID if configured; stacked - include both the SVLAN ID and the VLAN ID if configured.')
juniRadiusClientIncludeAcctMultiSessionIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 106), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAccessReq.setDescription('Enables/disables the inclusion of the Accounting Multilink Session ID in the RADIUS Access-Req packet.')
juniRadiusClientIncludeAcctMultiSessionIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 107), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAcctStart.setDescription('Enables/disables the inclusion of the Accounting Multilink Session ID in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeAcctMultiSessionIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 108), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctMultiSessionIdInAcctStop.setDescription('Enables/disables the inclusion of the Accounting Multilink Session ID in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeAscendNumInMultilinkInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 109), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAccessReq.setDescription('Enables/disables the inclusion of the Ascend Num In Multilink attribute in the RADIUS Access-Req packet.')
juniRadiusClientIncludeAscendNumInMultilinkInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 110), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAcctStart.setDescription('Enables/disables the inclusion of the Ascend Num In Multilink attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeAscendNumInMultilinkInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 111), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAscendNumInMultilinkInAcctStop.setDescription('Enables/disables the inclusion of the Ascend Num In Multilink attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientConnectInfoFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 112), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2))).clone(namedValues=NamedValues(("default", 0), ("l2tpConnectSpeed", 1), ("l2tpConnectSpeedRxWhenEqual", 2))).clone('default')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientConnectInfoFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientConnectInfoFormat.setDescription('The format used by the client for the Connect-Info attribute (attribute number 77): default - the Connect-Info attribute is generated from the underlying interface id string; l2tpConnectSpeed - the Connect-Info attribute is generated from the received L2TP connect speed AVPs, formatted in bits per second as <tx-connect-speed>[/<rx-connect-speed>], where the receive connect speed is included if non-zero and different from the transmit connect speed; l2tpConnectSpeedRxWhenEqual - the Connect-Info attribute is generated from the received L2TP connect speed AVPs, formatted in bits per second as <tx-connect-speed>/<rx-connect-speed>, where the receive connect speed is included if non-zero.')
juniRadiusClientIncludeProfileServiceDescrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 113), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAccessReq.setDescription('Enables/disables the inclusion of the AAA profile service description attribute in the RADIUS Access-Req packet.')
juniRadiusClientIncludeProfileServiceDescrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 114), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAcctStart.setDescription('Enables/disables the inclusion of the AAA profile service description attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeProfileServiceDescrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 115), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeProfileServiceDescrInAcctStop.setDescription('Enables/disables the inclusion of the AAA profile service description attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeAcctAuthenticInAcctOn = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 116), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctAuthenticInAcctOn.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctAuthenticInAcctOn.setDescription('Enables/disables the inclusion of the Acct-Authentic attribute in the RADIUS Accounting-On packet.')
juniRadiusClientIncludeAcctDelayTimeInAcctOn = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 117), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctDelayTimeInAcctOn.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctDelayTimeInAcctOn.setDescription('Enables/disables the inclusion of the Acct-Delay-Time attribute in the RADIUS Accounting-On packet.')
juniRadiusClientIncludeAcctSessionIdInAcctOn = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 118), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAcctOn.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAcctOn.setDescription('Enables/disables the inclusion of the Acct-Session-Id attribute in the RADIUS Accounting-On packet.')
juniRadiusClientIncludeNasIdentifierInAcctOn = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 119), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctOn.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctOn.setDescription('Enables/disables the inclusion of the Nas-Identifier attribute in the RADIUS Accounting-On packet.')
juniRadiusClientIncludeEventTimestampInAcctOn = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 120), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctOn.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctOn.setDescription('Enables/disables the inclusion of the Event-Timestamp attribute in the RADIUS Accounting-On packet.')
juniRadiusClientIncludeAcctAuthenticInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 121), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctAuthenticInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctAuthenticInAcctOff.setDescription('Enables/disables the inclusion of the Acct-Authentic attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeAcctDelayTimeInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 122), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctDelayTimeInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctDelayTimeInAcctOff.setDescription('Enables/disables the inclusion of the Acct-Delay-Time attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeAcctSessionIdInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 123), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctSessionIdInAcctOff.setDescription('Enables/disables the inclusion of the Acct-Session-Id attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeAcctTerminateCauseInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 124), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTerminateCauseInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeAcctTerminateCauseInAcctOff.setDescription('Enables/disables the inclusion of the Acct-Terminate-Cause attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeNasIdentifierInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 125), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeNasIdentifierInAcctOff.setDescription('Enables/disables the inclusion of the Nas-Identifier attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeEventTimestampInAcctOff = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 126), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctOff.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeEventTimestampInAcctOff.setDescription('Enables/disables the inclusion of the Event-Timestamp attribute in the RADIUS Accounting-Off packet.')
juniRadiusClientIncludeDhcpOptionsInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 127), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAccessReq.setDescription('Enables/disables the inclusion of the Dhcp-Options (vsa) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeDhcpMacAddressInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 128), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAccessReq.setDescription('Enables/disables the inclusion of the Dhcp-Mac-Address (vsa) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeDhcpGiAddressInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 129), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAccessReq.setDescription('Enables/disables the inclusion of the Dhcp-Gi-Address (vsa) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeDhcpOptionsInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 130), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAcctStart.setDescription('Enables/disables the inclusion of the Dhcp-Options (vsa) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeDhcpMacAddressInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 131), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAcctStart.setDescription('Enables/disables the inclusion of the Dhcp-Mac-Address (vsa) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeDhcpGiAddressInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 132), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAcctStart.setDescription('Enables/disables the inclusion of the Dhcp-Gi-Address (vsa) attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeDhcpOptionsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 133), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpOptionsInAcctStop.setDescription('Enables/disables the inclusion of the Dhcp-Options (vsa) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeDhcpMacAddressInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 134), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpMacAddressInAcctStop.setDescription('Enables/disables the inclusion of the Dhcp-Mac-Address (vsa) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeDhcpGiAddressInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 135), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDhcpGiAddressInAcctStop.setDescription('Enables/disables the inclusion of the Dhcp-Gi-Address (vsa) attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientNasPortIdOverrideRemoteCircuitId = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 136), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortIdOverrideRemoteCircuitId.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortIdOverrideRemoteCircuitId.setDescription('Enables/disables overriding the Nas-Port-Id with the PPPoE Remote Circuit Id.')
juniRadiusClientCallingStationIdOverrideRemoteCircuitId = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 137), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientCallingStationIdOverrideRemoteCircuitId.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientCallingStationIdOverrideRemoteCircuitId.setDescription('Enables/disables overriding the Calling-Station-Id with the PPPoE Remote Circuit Id.')
juniRadiusClientIncludeMlpppBundleNameInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 138), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAccessReq.setDescription('Enables/disables the inclusion of the MLPPP-Bundle-Name (VSA) attribute in the RADIUS Access-Request packet.')
juniRadiusClientIncludeMlpppBundleNameInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 139), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAcctStart.setDescription('Enables/disables the inclusion of the MLPPP-Bundle-Name (VSA) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeMlpppBundleNameInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 140), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeMlpppBundleNameInAcctStop.setDescription('Enables/disables the inclusion of the MLPPP-Bundle-Name (VSA) in the RADIUS Accounting-Stop packet.')
juniRadiusClientOverrideNasInfo = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 141), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientOverrideNasInfo.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientOverrideNasInfo.setDescription("Enables/disables overriding the nas-ip-address and nas-identifier with values from the authentication virtual router. If juniRadiusClientNasIpAddrUse is not 'normal', the nas-ip-address is not overridden.")
juniRadiusClientIncludeInterfaceDescriptionInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 142), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAccessReq.setDescription('Enables/disables the inclusion of the interface description attribute in the RADIUS Access-Req packet.')
juniRadiusClientIncludeInterfaceDescriptionInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 143), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAcctStart.setDescription('Enables/disables the inclusion of the interface description attribute in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeInterfaceDescriptionInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 144), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceDescriptionInAcctStop.setDescription('Enables/disables the inclusion of the interface description attribute in the RADIUS Accounting-Stop packet.')
juniRadiusClientAtmNasPortFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 145), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("none", 0), ("extended", 1))).clone('none')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientAtmNasPortFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientAtmNasPortFormat.setDescription('The format used by the client for the Nas-Port attribute (attribute number 5) for ATM interfaces: none - use the format specified in juniRadiusClientNasPortFormat; extended - use the extended format determined by the ATM nas-port field width values.')
juniRadiusClientNasPortFieldWidthAtmSlot = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 146), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(5)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmSlot.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmSlot.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the slot of the ATM interface when the value of juniRadiusClientAtmNasPortFormat is extended.')
juniRadiusClientNasPortFieldWidthAtmAdapter = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 147), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmAdapter.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmAdapter.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the adapter of the ATM interface when the value of juniRadiusClientAtmNasPortFormat is extended.')
juniRadiusClientNasPortFieldWidthAtmPort = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 148), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(3)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmPort.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmPort.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the port of the ATM interface when the value of juniRadiusClientAtmNasPortFormat is extended.')
juniRadiusClientNasPortFieldWidthAtmVpi = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 149), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(8)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmVpi.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmVpi.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the VPI of the ATM interface when the value of juniRadiusClientAtmNasPortFormat is extended.')
juniRadiusClientNasPortFieldWidthAtmVci = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 150), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(16)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmVci.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthAtmVci.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the VCI of the ATM interface when the value of juniRadiusClientAtmNasPortFormat is extended.')
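The extended ATM NAS-Port format described by the scalars above packs the slot, adapter, port, VPI and VCI of the interface into the 32-bit NAS-Port value using the configured field widths. A minimal sketch of that packing, assuming the fields are concatenated most-significant-first in slot/adapter/port/VPI/VCI order (the layout order is an assumption; with the defaults declared here - slot 5, port 3, VPI 8, VCI 16 bits, adapter width 0 so the field is omitted - exactly 32 bits are filled):

```python
def pack_nas_port(fields):
    """Pack (value, bit_width) pairs into one integer, most-significant
    field first, mimicking the extended NAS-Port encoding."""
    nas_port = 0
    for value, width in fields:
        if value >= 1 << width:
            raise ValueError("value %d does not fit in %d bits" % (value, width))
        nas_port = (nas_port << width) | value
    return nas_port

# Default ATM widths, adapter field omitted: slot 1, port 2, VPI 3, VCI 4
# pack_nas_port([(1, 5), (2, 3), (3, 8), (4, 16)]) -> 0x0A030004
```

A width of 0 simply drops that field from the encoding, which is how an unset adapter width behaves in this sketch.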
juniRadiusClientEthernetNasPortFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 151), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1))).clone(namedValues=NamedValues(("none", 0), ("extended", 1))).clone('none')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientEthernetNasPortFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientEthernetNasPortFormat.setDescription('The format used by the client for the Nas-Port attribute (attribute number 5) for Ethernet interfaces: none - use the format specified in juniRadiusClientNasPortFormat; extended - use the extended format determined by the Ethernet nas-port field width values.')
juniRadiusClientNasPortFieldWidthEthernetSlot = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 152), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(5)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetSlot.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetSlot.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the slot of the Ethernet interface when the value of juniRadiusClientEthernetNasPortFormat is extended.')
juniRadiusClientNasPortFieldWidthEthernetAdapter = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 153), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetAdapter.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetAdapter.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the adapter of the ethernet interface when the value of juniRadiusClientEthernetNasPortFormat is extended. ')
juniRadiusClientNasPortFieldWidthEthernetPort = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 154), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(3)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetPort.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetPort.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the port of the ethernet interface when the value of juniRadiusClientEthernetNasPortFormat is extended. ')
juniRadiusClientNasPortFieldWidthEthernetSVlan = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 155), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(12)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetSVlan.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetSVlan.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the svlan of the ethernet interface when the value of juniRadiusClientEthernetNasPortFormat is extended. ')
juniRadiusClientNasPortFieldWidthEthernetVlan = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 156), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 32)).clone(12)).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetVlan.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientNasPortFieldWidthEthernetVlan.setDescription('Specifies the number of bits in the NAS-Port attribute (attribute 5) to be used for the vlan of the ethernet interface when the value of juniRadiusClientEthernetNasPortFormat is extended. ')
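# The five juniRadiusClientNasPortFieldWidthEthernet* scalars above carve the
# 32-bit NAS-Port attribute into slot/adapter/port/svlan/vlan fields whose
# widths must sum to at most 32 bits. A minimal bit-packing sketch, assuming
# MSB-first field order (the actual ERX on-wire layout is not stated here):
#
# def pack_nas_port(values, widths):
#     """Pack named fields MSB-first into a 32-bit NAS-Port value.
#
#     `widths` mirrors the juniRadiusClientNasPortFieldWidthEthernet*
#     scalars; MSB-first ordering is an illustrative assumption.
#     """
#     assert sum(widths.values()) <= 32, "field widths must fit in 32 bits"
#     nas_port = 0
#     for name in ("slot", "adapter", "port", "svlan", "vlan"):
#         w = widths[name]
#         v = values.get(name, 0)
#         assert v >> w == 0, f"{name} value {v} overflows {w} bits"
#         nas_port = (nas_port << w) | v
#     return nas_port
#
# def unpack_nas_port(nas_port, widths):
#     """Inverse of pack_nas_port (same field-order assumption)."""
#     out = {}
#     for name in reversed(("slot", "adapter", "port", "svlan", "vlan")):
#         w = widths[name]
#         out[name] = nas_port & ((1 << w) - 1)
#         nas_port >>= w
#     return out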
juniRadiusClientRemoteCircuitIdFormat = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 157), OctetString().subtype(subtypeSpec=ValueSizeConstraint(1, 3)).clone(hexValue="1")).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientRemoteCircuitIdFormat.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientRemoteCircuitIdFormat.setDescription('The set of PPPoE Remote Circuit Id components configured. Each octet in this object contains one of the values defined in the JuniRadiusClientRemoteCircuitIdFormatComponents TEXTUAL-CONVENTION. Only the following combinations are permitted: agentCircuitId; remoteCircuitId; agentCircuitId, remoteCircuitId; nasIdentifier, agentCircuitId; nasIdentifier, remoteCircuitId; nasIdentifier, agentCircuitId, remoteCircuitId; dsl-format-1.')
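# The permitted component combinations from the DESCRIPTION above can be
# modeled as a small validation table. This is an illustrative sketch: the
# component names come from the DESCRIPTION text, while the object itself
# carries the TEXTUAL-CONVENTION octet values rather than these strings.
#
# PERMITTED = {
#     ("agentCircuitId",),
#     ("remoteCircuitId",),
#     ("agentCircuitId", "remoteCircuitId"),
#     ("nasIdentifier", "agentCircuitId"),
#     ("nasIdentifier", "remoteCircuitId"),
#     ("nasIdentifier", "agentCircuitId", "remoteCircuitId"),
#     ("dsl-format-1",),
# }
#
# def build_remote_circuit_id(components, delimiter="#"):
#     """Join a permitted component tuple using the configured delimiter
#     (juniRadiusClientRemoteCircuitIdDelimiter defaults to '#')."""
#     if tuple(components) not in PERMITTED:
#         raise ValueError(f"combination not permitted: {components}")
#     return delimiter.join(components)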
juniRadiusClientRemoteCircuitIdDelimiter = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 158), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(1, 1)).setFixedLength(1).clone('#')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientRemoteCircuitIdDelimiter.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientRemoteCircuitIdDelimiter.setDescription("The character used for delimiting fields in the PPPoE Remote Circuit ID. The default value is '#'.")
juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 159), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq.setDescription('Enables/disables the inclusion of the l2c-access-loop-parameters (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cDownStreamDataInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 160), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAccessReq.setDescription('Enables/disables the inclusion of the l2c-down-stream-data (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cUpStreamDataInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 161), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAccessReq.setDescription('Enables/disables the inclusion of the l2c-up-stream-data (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cDownStreamDataInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 162), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAcctStart.setDescription('Enables/disables the inclusion of the l2c-down-stream-data (VSA) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeL2cUpStreamDataInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 163), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAcctStart.setDescription('Enables/disables the inclusion of the l2c-up-stream-data (VSA) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeL2cDownStreamDataInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 164), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDownStreamDataInAcctStop.setDescription('Enables/disables the inclusion of the l2c-down-stream-data (VSA) in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeL2cUpStreamDataInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 165), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cUpStreamDataInAcctStop.setDescription('Enables/disables the inclusion of the l2c-up-stream-data (VSA) in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeDslForumAttributesInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 166), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAccessReq.setDescription('Enables/disables the inclusion of the DSL Forum attributes (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeDslForumAttributesInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 167), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAcctStart.setDescription('Enables/disables the inclusion of the DSL Forum attributes (VSA) in the RADIUS Accounting-Start packet.')
juniRadiusClientIncludeDslForumAttributesInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 168), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDslForumAttributesInAcctStop.setDescription('Enables/disables the inclusion of the DSL Forum attributes (VSA) in the RADIUS Accounting-Stop packet.')
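# The Include* scalars above (and the per-VSA toggles that follow) are all
# TruthValue booleans gating one VSA in one RADIUS packet type. A sketch of
# that pattern, mapping toggle readings to the set of VSAs to include; the
# toggle-to-VSA pairs are taken from the DESCRIPTION clauses, and TruthValue
# uses true(1)/false(2) per RFC 2579:
#
# ACCESS_REQUEST_TOGGLES = {
#     "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq": "l2c-access-loop-parameters",
#     "juniRadiusClientIncludeL2cDownStreamDataInAccessReq": "l2c-down-stream-data",
#     "juniRadiusClientIncludeL2cUpStreamDataInAccessReq": "l2c-up-stream-data",
# }
#
# def included_vsas(toggle_values):
#     """Return the VSA names whose toggle reads TruthValue true(1)."""
#     return [vsa for name, vsa in ACCESS_REQUEST_TOGGLES.items()
#             if toggle_values.get(name) == 1]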
juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 169), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq.setDescription('Enables/disables the inclusion of l2cd-acc-loop-cir-id (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 170), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-bin (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 171), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-asc (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 172), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 173), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 174), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 175), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 176), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 177), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 178), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 179), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 180), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 181), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 182), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 183), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-up (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 184), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 185), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-dn (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cDslLineStateInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 186), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAccessReq.setDescription('Enables/disables the inclusion of l2cd-dsl-line-state (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cDslTypeInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 187), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAccessReq.setDescription('Enables/disables the inclusion of l2cd-dsl-type (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 188), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart.setDescription('Enables/disables the inclusion of l2cd-acc-loop-cir-id (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 189), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-bin (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 190), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-asc (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 191), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 192), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 193), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 194), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 195), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 196), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 197), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 198), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 199), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 200), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 201), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 202), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-up (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 203), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 204), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-dn (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cDslLineStateInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 205), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAcctStart.setDescription('Enables/disables the inclusion of l2cd-dsl-line-state (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cDslTypeInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 206), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAcctStart.setDescription('Enables/disables the inclusion of l2cd-dsl-type (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 207), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop.setDescription('Enables/disables the inclusion of l2cd-acc-loop-cir-id (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 208), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-bin (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 209), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop.setDescription('Enables/disables the inclusion of l2cd-acc-aggr-cir-id-asc (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 210), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 211), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-act-data-rate-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 212), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 213), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-min-data-rate-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 214), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 215), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-att-data-rate-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 216), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 217), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-max-data-rate-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 218), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 219), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-min-lp-data-rate-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 220), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 221), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-up (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 222), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-max-interlv-delay-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 223), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop.setDescription('Enables/disables the inclusion of l2cd-act-interlv-delay-dn (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cDslLineStateInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 224), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslLineStateInAcctStop.setDescription('Enables/disables the inclusion of l2cd-dsl-line-state (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeL2cDslTypeInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 225), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeL2cDslTypeInAcctStop.setDescription('Enables/disables the inclusion of l2cd-dsl-type (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeInterfaceIdInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 226), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceIdInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceIdInAcctStart.setDescription('Enables/disables the inclusion of framed-interface-id (96) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6PrefixInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 227), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrefixInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrefixInAcctStart.setDescription('Enables/disables the inclusion of framed-ipv6-prefix (97) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeInterfaceIdInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 228), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceIdInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeInterfaceIdInAcctStop.setDescription('Enables/disables the inclusion of framed-interface-id (96) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpAddrInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 229), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpAddrInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpAddrInAcctStop.setDescription('Enables/disables the inclusion of framed-ip-address (8) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6PrefixInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 230), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrefixInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrefixInAcctStop.setDescription('Enables/disables the inclusion of framed-ipv6-prefix (97) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 231), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq.setDescription('Enables/disables the inclusion of downstream-calculated-qos-rate (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 232), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq.setDescription('Enables/disables the inclusion of upstream-calculated-qos-rate (VSA) in the RADIUS Access-Request packet.')
juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 233), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart.setDescription('Enables/disables the inclusion of downstream-calculated-qos-rate (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 234), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart.setDescription('Enables/disables the inclusion of upstream-calculated-qos-rate (VSA) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 235), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop.setDescription('Enables/disables the inclusion of downstream-calculated-qos-rate (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 236), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop.setDescription('Enables/disables the inclusion of upstream-calculated-qos-rate (VSA) in the RADIUS Acct-Stop packet.')
juniRadiusClientIgnorePppoeMaxSession = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 237), TruthValue().clone('true')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIgnorePppoeMaxSession.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIgnorePppoeMaxSession.setDescription('Enables/disables ignoring the PPPoE Max Session (vsa) attribute in the RADIUS Access-Accept packet.')
juniRadiusClientIncludeIpv6AccountingInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 238), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6AccountingInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6AccountingInAcctStop.setDescription('Enables/disables the inclusion of the IPv6 Accounting (VSA) attributes in the RADIUS Accounting-Stop packet.')
juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 239), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart.setDescription('Enables/disables the inclusion of delegated-ipv6-prefix (123) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 240), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop.setDescription('Enables/disables the inclusion of delegated-ipv6-prefix (123) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeFramedIpv6PoolInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 241), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6PoolInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6PoolInAcctStart.setDescription('Enables/disables the inclusion of framed-ipv6-pool (100) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeFramedIpv6PoolInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 242), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6PoolInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6PoolInAcctStop.setDescription('Enables/disables the inclusion of framed-ipv6-pool (100) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeFramedIpv6RouteInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 243), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6RouteInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6RouteInAcctStart.setDescription('Enables/disables the inclusion of framed-ipv6-route (99) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeFramedIpv6RouteInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 244), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6RouteInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeFramedIpv6RouteInAcctStop.setDescription('Enables/disables the inclusion of framed-ipv6-route (99) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 245), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart.setDescription('Enables/disables the inclusion of ipv6-local-interface (vsa) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 246), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop.setDescription('Enables/disables the inclusion of ipv6-local-interface (vsa) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 247), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart.setDescription('Enables/disables the inclusion of ipv6-nd-ra-prefix (vsa) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 248), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop.setDescription('Enables/disables the inclusion of ipv6-nd-ra-prefix (vsa) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 249), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart.setDescription('Enables/disables the inclusion of ipv6-primary-dns (vsa) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 250), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop.setDescription('Enables/disables the inclusion of ipv6-primary-dns (vsa) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 251), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart.setDescription('Enables/disables the inclusion of ipv6-secondary-dns (vsa) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 252), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop.setDescription('Enables/disables the inclusion of ipv6-secondary-dns (vsa) in the RADIUS Acct-Stop packet.')
juniRadiusClientIncludeIpv6VirtualRouterInAcctStart = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 253), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6VirtualRouterInAcctStart.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6VirtualRouterInAcctStart.setDescription('Enables/disables the inclusion of ipv6-virtual-router (vsa) in the RADIUS Acct-Start packet.')
juniRadiusClientIncludeIpv6VirtualRouterInAcctStop = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 1, 254), TruthValue().clone('false')).setMaxAccess("readwrite")
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6VirtualRouterInAcctStop.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientIncludeIpv6VirtualRouterInAcctStop.setDescription('Enables/disables the inclusion of ipv6-virtual-router (vsa) in the RADIUS Acct-Stop packet.')
juniRadiusAuthClientInvalidServerAddresses = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientInvalidServerAddresses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientInvalidServerAddresses.setDescription('The number of RADIUS Access-Response packets received from unknown addresses.')
juniRadiusAuthClientServerTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2), )
if mibBuilder.loadTexts: juniRadiusAuthClientServerTable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerTable.setDescription('The (conceptual) table listing the RADIUS authentication servers with which the client shares a secret.')
juniRadiusAuthClientServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1), ).setIndexNames((0, "Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerAddress"))
if mibBuilder.loadTexts: juniRadiusAuthClientServerEntry.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerEntry.setDescription('An entry (conceptual row) representing a RADIUS authentication server with which the client shares a secret.')
juniRadiusAuthClientServerAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 1), IpAddress())
if mibBuilder.loadTexts: juniRadiusAuthClientServerAddress.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerAddress.setDescription('The IP address of the RADIUS authentication server referred to in this table entry. A value of 0.0.0.0 indicates this entry is not in use.')
juniRadiusAuthClientServerPortNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientServerPortNumber.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerPortNumber.setDescription('The UDP port the client is using to send requests to this server.')
juniRadiusAuthClientRoundTripTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 3), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientRoundTripTime.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientRoundTripTime.setDescription('The time interval (in hundredths of seconds) between the most recent Access-Reply/Access-Challenge and the Access-Request that matched it from this RADIUS authentication server.')
juniRadiusAuthClientAccessRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRequests.setDescription('The number of RADIUS Access-Request packets sent to this server. This does not include retransmissions.')
juniRadiusAuthClientAccessRetransmissions = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRetransmissions.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRetransmissions.setDescription('The number of RADIUS Access-Request packets retransmitted to this RADIUS authentication server.')
juniRadiusAuthClientAccessAccepts = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientAccessAccepts.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAccessAccepts.setDescription('The number of RADIUS Access-Accept packets (valid or invalid) received from this server.')
juniRadiusAuthClientAccessRejects = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRejects.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAccessRejects.setDescription('The number of RADIUS Access-Reject packets (valid or invalid) received from this server.')
juniRadiusAuthClientAccessChallenges = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientAccessChallenges.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAccessChallenges.setDescription('The number of RADIUS Access-Challenge packets (valid or invalid) received from this server.')
juniRadiusAuthClientMalformedAccessResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientMalformedAccessResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientMalformedAccessResponses.setDescription('The number of malformed RADIUS Access-Response packets received from this server. Malformed packets include packets with an invalid length. Packets with bad authenticators, bad signature attributes, or unknown types are not counted as malformed access responses.')
juniRadiusAuthClientBadAuthenticators = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientBadAuthenticators.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientBadAuthenticators.setDescription('The number of RADIUS Access-Response packets containing invalid authenticators or signature attributes received from this server.')
juniRadiusAuthClientPendingRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 11), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientPendingRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientPendingRequests.setDescription('The number of RADIUS Access-Request packets destined for this server that have not yet timed out or received a response. This variable is incremented when an Access-Request is sent and decremented due to receipt of an Access-Accept, Access-Reject or Access-Challenge, a timeout or retransmission.')
juniRadiusAuthClientTimeouts = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 12), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientTimeouts.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientTimeouts.setDescription('The number of authentication timeouts to this server. After a timeout the client may retry to the same server, send to a different server, or give up. A retry to the same server is counted as a retransmit as well as a timeout. A send to a different server is counted as a Request as well as a timeout.')
juniRadiusAuthClientUnknownTypes = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientUnknownTypes.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientUnknownTypes.setDescription('The number of RADIUS packets of unknown type which were received from this server on the authentication port.')
juniRadiusAuthClientPacketsDropped = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 2, 1, 14), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientPacketsDropped.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientPacketsDropped.setDescription('The number of RADIUS packets which were received from this server on the authentication port and dropped for some other reason.')
juniRadiusAuthClientCfgServerTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3), )
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerTable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerTable.setDescription('The (conceptual) table listing the RADIUS authentication servers with which the client shares a secret.')
juniRadiusAuthClientCfgServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1), ).setIndexNames((0, "Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgServerAddress"))
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerEntry.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerEntry.setDescription('An entry (conceptual row) representing a RADIUS authentication server with which the client shares a secret.')
juniRadiusAuthClientCfgServerAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 1), IpAddress())
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerAddress.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerAddress.setDescription('The IP address of the RADIUS authentication server referred to in this table entry.')
juniRadiusAuthClientCfgServerPortNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(1812)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerPortNumber.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgServerPortNumber.setDescription('The UDP port the client is using to send requests to this server.')
juniRadiusAuthClientCfgKey = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32)).clone(hexValue="")).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgKey.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgKey.setDescription('The secret (RADIUS authenticator) used by the client during exchanges with this authentication server. The default is a zero-length string, indicating no authenticator is used.')
juniRadiusAuthClientCfgTimeoutInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1000)).clone(3)).setUnits('seconds').setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgTimeoutInterval.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgTimeoutInterval.setDescription('The interval between retransmissions of a request to this authentication server.')
juniRadiusAuthClientCfgRetries = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 100)).clone(3)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgRetries.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgRetries.setDescription('The maximum number of times to resend a request to this authentication server (in addition to the original request), before resorting to the server specified in the next entry.')
juniRadiusAuthClientCfgMaxPendingRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(10, 32000)).clone(255)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgMaxPendingRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgMaxPendingRequests.setDescription('The maximum number of outstanding requests this server can support.')
juniRadiusAuthClientCfgRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 7), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgRowStatus.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgRowStatus.setDescription("Supports 'createAndGo' and 'destroy' only.")
juniRadiusAuthClientCfgPrecedence = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgPrecedence.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgPrecedence.setDescription('Relative precedence of this server with respect to other servers configured in this table. Lower values correspond to higher precedence. Precedence is assigned by the device, in order of entry creation, from higher to lower precedence.')
juniRadiusAuthClientCfgDeadTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 2, 3, 1, 9), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1440))).setUnits('minutes').setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAuthClientCfgDeadTime.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientCfgDeadTime.setDescription('The period of time, in minutes, to ignore this server after a request to the server times out (thereby avoiding additional request timeouts for this period, if the server failure persists).')
juniRadiusAcctClientInvalidServerAddresses = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientInvalidServerAddresses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientInvalidServerAddresses.setDescription('The number of RADIUS Accounting-Response packets received from unknown addresses.')
juniRadiusAcctClientServerTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2), )
if mibBuilder.loadTexts: juniRadiusAcctClientServerTable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerTable.setDescription('The (conceptual) table listing the RADIUS accounting servers with which the client shares a secret.')
juniRadiusAcctClientServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1), ).setIndexNames((0, "Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerAddress"))
if mibBuilder.loadTexts: juniRadiusAcctClientServerEntry.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerEntry.setDescription('An entry (conceptual row) representing a RADIUS accounting server with which the client shares a secret.')
juniRadiusAcctClientServerAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 1), IpAddress())
if mibBuilder.loadTexts: juniRadiusAcctClientServerAddress.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerAddress.setDescription('The IP address of the RADIUS accounting server referred to in this table entry. A value of 0.0.0.0 indicates this entry is not in use.')
juniRadiusAcctClientServerPortNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientServerPortNumber.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerPortNumber.setDescription('The UDP port the client is using to send requests to this server.')
juniRadiusAcctClientRoundTripTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 3), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientRoundTripTime.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientRoundTripTime.setDescription('The time interval between the most recent Accounting-Response and the Accounting-Request that matched it from this RADIUS accounting server.')
juniRadiusAcctClientRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientRequests.setDescription('The number of RADIUS Accounting-Request packets sent. This does not include retransmissions.')
juniRadiusAcctClientRetransmissions = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientRetransmissions.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientRetransmissions.setDescription('The number of RADIUS Accounting-Request packets retransmitted to this RADIUS accounting server. Retransmissions include retries where the Identifier and Acct-Delay have been updated, as well as those in which they remain the same.')
juniRadiusAcctClientResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientResponses.setDescription('The number of RADIUS packets received on the accounting port from this server.')
juniRadiusAcctClientMalformedResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientMalformedResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientMalformedResponses.setDescription('The number of malformed RADIUS Accounting-Response packets received from this server. Malformed packets include packets with an invalid length. Bad authenticators and unknown types are not included as malformed accounting responses.')
juniRadiusAcctClientBadAuthenticators = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientBadAuthenticators.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientBadAuthenticators.setDescription('The number of RADIUS Accounting-Response packets which contained invalid authenticators received from this server.')
juniRadiusAcctClientPendingRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 9), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientPendingRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientPendingRequests.setDescription('The number of RADIUS Accounting-Request packets sent to this server that have not yet timed out or received a response. This variable is incremented when an Accounting-Request is sent and decremented due to receipt of an Accounting-Response, a timeout or a retransmission.')
juniRadiusAcctClientTimeouts = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientTimeouts.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientTimeouts.setDescription('The number of accounting timeouts to this server. After a timeout the client may retry to the same server, send to a different server, or give up. A retry to the same server is counted as a retransmit as well as a timeout. A send to a different server is counted as an Accounting-Request as well as a timeout.')
juniRadiusAcctClientUnknownTypes = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 11), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientUnknownTypes.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientUnknownTypes.setDescription('The number of RADIUS packets of unknown type which were received from this server on the accounting port.')
juniRadiusAcctClientPacketsDropped = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 12), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientPacketsDropped.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientPacketsDropped.setDescription('The number of RADIUS packets which were received from this server on the accounting port and dropped for some other reason.')
juniRadiusAcctClientStartRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 13), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientStartRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientStartRequests.setDescription('The number of RADIUS Accounting-Start request packets sent. This does not include retransmissions.')
juniRadiusAcctClientInterimRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 14), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientInterimRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientInterimRequests.setDescription('The number of RADIUS Accounting-Interim request packets sent. This does not include retransmissions.')
juniRadiusAcctClientStopRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 15), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientStopRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientStopRequests.setDescription('The number of RADIUS Accounting-Stop request packets sent. This does not include retransmissions.')
juniRadiusAcctClientStartResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 16), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientStartResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientStartResponses.setDescription('The number of RADIUS accounting-start response packets received on the accounting port from this server.')
juniRadiusAcctClientInterimResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 17), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientInterimResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientInterimResponses.setDescription('The number of RADIUS accounting-interim response packets received on the accounting port from this server.')
juniRadiusAcctClientStopResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 18), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientStopResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientStopResponses.setDescription('The number of RADIUS accounting-stop response packets received on the accounting port from this server.')
juniRadiusAcctClientRejectRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 19), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientRejectRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientRejectRequests.setDescription('The number of RADIUS accounting-reject request packets sent to this server on the accounting port.')
juniRadiusAcctClientRejectResponses = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 2, 1, 20), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientRejectResponses.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientRejectResponses.setDescription('The number of RADIUS accounting-reject response packets received on the accounting port from this server.')
juniRadiusAcctClientCfgServerTable = MibTable((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3), )
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerTable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerTable.setDescription('The (conceptual) table listing the RADIUS accounting servers with which the client shares a secret.')
juniRadiusAcctClientCfgServerEntry = MibTableRow((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1), ).setIndexNames((0, "Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgServerAddress"))
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerEntry.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerEntry.setDescription('An entry (conceptual row) representing a RADIUS accounting server with which the client shares a secret.')
juniRadiusAcctClientCfgServerAddress = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 1), IpAddress())
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerAddress.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerAddress.setDescription('The IP address of the RADIUS accounting server referred to in this table entry.')
juniRadiusAcctClientCfgServerPortNumber = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(1813)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerPortNumber.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgServerPortNumber.setDescription('The UDP port the client is using to send requests to this server.')
juniRadiusAcctClientCfgKey = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 3), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32)).clone(hexValue="")).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgKey.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgKey.setDescription('The secret (RADIUS authenticator) used by the client during exchanges with this accounting server. The default is a zero-length string, indicating no authenticator is used.')
juniRadiusAcctClientCfgTimeoutInterval = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 4), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 1000)).clone(3)).setUnits('seconds').setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgTimeoutInterval.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgTimeoutInterval.setDescription('The interval between retransmissions of a request to this accounting server.')
juniRadiusAcctClientCfgRetries = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 5), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 100)).clone(3)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgRetries.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgRetries.setDescription('The maximum number of times to resend a request to this accounting server (in addition to the original request), before resorting to the server specified in the next entry.')
juniRadiusAcctClientCfgMaxPendingRequests = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(10, 96000)).clone(255)).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgMaxPendingRequests.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgMaxPendingRequests.setDescription('The maximum number of outstanding requests this server can support.')
juniRadiusAcctClientCfgRowStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 7), RowStatus()).setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgRowStatus.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgRowStatus.setDescription("Supports 'createAndGo' and 'destroy' only.")
juniRadiusAcctClientCfgPrecedence = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 8), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 2147483647))).setMaxAccess("readonly")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgPrecedence.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgPrecedence.setDescription('Relative precedence of this server with respect to other servers configured in this table. Lower values correspond to higher precedence. Precedence is assigned by the device, in order of entry creation, from higher to lower precedence.')
juniRadiusAcctClientCfgDeadTime = MibTableColumn((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 1, 3, 3, 1, 9), Integer32().subtype(subtypeSpec=ValueRangeConstraint(0, 1440))).setUnits('minutes').setMaxAccess("readcreate")
if mibBuilder.loadTexts: juniRadiusAcctClientCfgDeadTime.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientCfgDeadTime.setDescription('The period of time, in minutes, to ignore this server after a request to the server times out (thereby avoiding additional request timeouts for this period, if the server failure persists).')
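# Illustrative sketch, not part of the generated MIB: per the descriptions of
# juniRadiusAcctClientCfgTimeoutInterval and juniRadiusAcctClientCfgRetries
# above, retries are counted in addition to the original request, so the
# worst-case time spent on one server before falling back to the
# next-precedence entry is roughly timeout * (retries + 1). The helper name
# and default values (the MIB defaults of 3 and 3) are assumptions.
def _worst_case_failover_delay(timeout_seconds=3, retries=3):
    """Approximate seconds spent on one unresponsive accounting server
    before the client moves to the next configured server."""
    return timeout_seconds * (retries + 1)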
juniRadiusClientTrapControl = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4))
juniRadiusAuthClientUnavailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 1), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAuthClientUnavailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientUnavailableServer.setDescription("The IP address of the RADIUS client's former authentication server that is no longer available. The value of this object is equivalent to the prior value of juniRadiusAuthClientCfgServerAddress.")
juniRadiusAuthClientNextAvailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 2), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAuthClientNextAvailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientNextAvailableServer.setDescription('The next available RADIUS authentication server, replacing the one that is unavailable. The value of this object is equivalent to the current value of juniRadiusAuthClientCfgServerAddress.')
juniRadiusAcctClientUnavailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 3), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAcctClientUnavailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientUnavailableServer.setDescription("The IP address of the RADIUS client's former accounting server that is no longer available. The value of this object is equivalent to the prior value of juniRadiusAcctClientCfgServerAddress.")
juniRadiusAcctClientNextAvailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 4), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAcctClientNextAvailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientNextAvailableServer.setDescription('The next available RADIUS accounting server, replacing the one that is unavailable. The value of this object is equivalent to the current value of juniRadiusAcctClientCfgServerAddress.')
juniRadiusAuthClientAvailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 5), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAuthClientAvailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientAvailableServer.setDescription('The RADIUS authentication server that has become available again after the configured dead-time period. The value of this object is equivalent to the current value of juniRadiusAuthClientCfgServerAddress.')
juniRadiusAcctClientAvailableServer = MibScalar((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 4, 6), IpAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: juniRadiusAcctClientAvailableServer.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientAvailableServer.setDescription('The RADIUS accounting server that has become available again after the configured dead-time period. The value of this object is equivalent to the current value of juniRadiusAcctClientCfgServerAddress.')
juniRadiusClientTraps = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3))
juniRadiusClientTrapPrefix = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0))
juniRadiusAuthClientServerUnavailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 1)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientNextAvailableServer"))
if mibBuilder.loadTexts: juniRadiusAuthClientServerUnavailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerUnavailable.setDescription('This trap will be generated when the requested authentication server is not available.')
juniRadiusAuthClientNoServerAvailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 2))
if mibBuilder.loadTexts: juniRadiusAuthClientNoServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientNoServerAvailable.setDescription('This trap will be generated when none of the requested authentication servers are available.')
juniRadiusAcctClientServerUnavailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 3)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientNextAvailableServer"))
if mibBuilder.loadTexts: juniRadiusAcctClientServerUnavailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerUnavailable.setDescription('This trap will be generated when the requested accounting server is not available.')
juniRadiusAcctClientNoServerAvailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 4))
if mibBuilder.loadTexts: juniRadiusAcctClientNoServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientNoServerAvailable.setDescription('This trap will be generated when none of the requested accounting servers are available.')
juniRadiusAuthClientServerAvailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 5)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAvailableServer"))
if mibBuilder.loadTexts: juniRadiusAuthClientServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientServerAvailable.setDescription('This trap will be generated when the requested authentication server becomes available again after a period of time.')
juniRadiusAcctClientServerAvailable = NotificationType((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 3, 0, 6)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientAvailableServer"))
if mibBuilder.loadTexts: juniRadiusAcctClientServerAvailable.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientServerAvailable.setDescription('This trap will be generated when the requested accounting server becomes available again after a period of time.')
juniRadiusClientMIBConformance = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2))
juniRadiusClientMIBCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1))
juniRadiusClientMIBGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2))
juniRadiusAuthClientCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 1)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusGeneralClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAuthClientCompliance = juniRadiusAuthClientCompliance.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAuthClientCompliance.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when the juniRadiusClientSourceAddress object was added.')
juniRadiusAcctClientCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 2)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusGeneralClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAcctClientCompliance = juniRadiusAcctClientCompliance.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctClientCompliance.setDescription('Obsolete compliance statement for accounting clients implementing the Juniper RADIUS Client MIB accounting functionality. This statement became obsolete when the juniRadiusClientSourceAddress object was added.')
juniRadiusAuthClientCompliance2 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 3)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusGeneralClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAuthClientCompliance2 = juniRadiusAuthClientCompliance2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAuthClientCompliance2.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when new objects were added.')
juniRadiusAcctClientCompliance2 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 4)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusGeneralClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAcctClientCompliance2 = juniRadiusAcctClientCompliance2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctClientCompliance2.setDescription('Obsolete compliance statement for accounting clients implementing the Juniper RADIUS Client MIB accounting functionality. This statement became obsolete when new objects were added.')
juniRadiusClientCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 5)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance = juniRadiusClientCompliance.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when new B-RAS objects were added.')
juniRadiusClientCompliance2 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 6)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance2 = juniRadiusClientCompliance2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance2.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when new objects were added.')
juniRadiusClientCompliance3 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 7)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance3 = juniRadiusClientCompliance3.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance3.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when the juniRadiusClientNasIpAddrUse object was added.')
juniRadiusClientCompliance4 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 8)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance4 = juniRadiusClientCompliance4.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance4.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when objects were added to indicate which RADIUS attributes should be included or excluded from RADIUS packets.')
juniRadiusClientCompliance5 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 9)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup5"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance5 = juniRadiusClientCompliance5.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance5.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when notifications for unavailable RADIUS servers were added.')
juniRadiusClientCompliance6 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 10)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup5"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance6 = juniRadiusClientCompliance6.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance6.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when attribute-ignore objects were added to the B-RAS group and accounting and authentication server available notifications were added.')
juniRadiusClientCompliance7 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 11)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup6"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance7 = juniRadiusClientCompliance7.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance7.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when authentication and accounting objects were added.')
juniRadiusClientCompliance8 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 12)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup7"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance8 = juniRadiusClientCompliance8.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance8.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when accounting reject counters were added.')
juniRadiusClientCompliance9 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 13)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup8"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance9 = juniRadiusClientCompliance9.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance9.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when a new object was added to the B-RAS group to indicate which RADIUS attributes should be included or excluded from RADIUS packets.')
juniRadiusClientCompliance10 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 14)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup9"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance10 = juniRadiusClientCompliance10.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance10.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality. This statement became obsolete when new objects were added to indicate which RADIUS attributes for DHCP VSAs should be included or excluded from RADIUS packets.')
juniRadiusClientCompliance11 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 15)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup10"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance11 = juniRadiusClientCompliance11.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance11.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance12 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 16)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup11"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance12 = juniRadiusClientCompliance12.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance12.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance13 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 17)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup14"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusClientCompliance13 = juniRadiusClientCompliance13.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance13.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance14 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 18)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup15"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusClientCompliance14 = juniRadiusClientCompliance14.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance14.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance15 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 19)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup16"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusClientCompliance15 = juniRadiusClientCompliance15.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance15.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance16 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 20)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup18"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusClientCompliance16 = juniRadiusClientCompliance16.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance16.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance17 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 21)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup19"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusClientCompliance17 = juniRadiusClientCompliance17.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusClientCompliance17.setDescription('Obsolete compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusClientCompliance18 = ModuleCompliance((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 1, 22)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBasicClientGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientGroup3"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientGroup4"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctNotificationGroup2"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusBrasClientGroup20"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusTunnelClientGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusClientCompliance18 = juniRadiusClientCompliance18.setStatus('current')
if mibBuilder.loadTexts: juniRadiusClientCompliance18.setDescription('The compliance statement for authentication clients implementing the Juniper RADIUS Client MIB authentication functionality.')
juniRadiusGeneralClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 1)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIdentifier"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAlgorithm"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusGeneralClientGroup = juniRadiusGeneralClientGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusGeneralClientGroup.setDescription('Obsolete basic collection of objects providing management of RADIUS Clients. This group became obsolete when juniRadiusClientSourceAddress was added.')
juniRadiusAuthClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 2)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessAccepts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRejects"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessChallenges"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientMalformedAccessResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgDeadTime"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAuthClientGroup = juniRadiusAuthClientGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAuthClientGroup.setDescription('Obsolete collection of objects providing management of RADIUS Authentication Clients. This group became obsolete when notification objects for an unavailable authentication server were added.')
juniRadiusAcctClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 3)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientMalformedResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgDeadTime"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAcctClientGroup = juniRadiusAcctClientGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctClientGroup.setDescription('Obsolete collection of objects providing management of RADIUS Accounting Clients. This group became obsolete when notification objects for an unavailable accounting server were added.')
juniRadiusGeneralClientGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 4)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIdentifier"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAlgorithm"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientSourceAddress"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusGeneralClientGroup2 = juniRadiusGeneralClientGroup2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusGeneralClientGroup2.setDescription('Obsolete basic collection of objects providing management of RADIUS Clients. This group became obsolete when new objects were added.')
juniRadiusBasicClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 5)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIdentifier"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAlgorithm"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientSourceAddress"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientUdpChecksum"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIdentifier"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBasicClientGroup = juniRadiusBasicClientGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBasicClientGroup.setDescription('Obsolete collection of objects providing basic management of RADIUS Clients. This group became obsolete when the juniRadiusClientRollover object was added.')
juniRadiusBrasClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 6)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup = juniRadiusBrasClientGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when new objects were added.')
juniRadiusTunnelClientGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 7)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTunnelAccounting"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusTunnelClientGroup = juniRadiusTunnelClientGroup.setStatus('current')
if mibBuilder.loadTexts: juniRadiusTunnelClientGroup.setDescription('An object providing management of tunneling functions for RADIUS Clients.')
juniRadiusBrasClientGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 8)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup2 = juniRadiusBrasClientGroup2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup2.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when the juniRadiusClientCallingStationIdFormat object was added.')
juniRadiusBasicClientGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 9)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIdentifier"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAlgorithm"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientSourceAddress"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientUdpChecksum"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIdentifier"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRollover"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBasicClientGroup2 = juniRadiusBasicClientGroup2.setStatus('current')
if mibBuilder.loadTexts: juniRadiusBasicClientGroup2.setDescription('A collection of objects providing basic management of RADIUS Clients.')
juniRadiusBrasClientGroup3 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 10)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup3 = juniRadiusBrasClientGroup3.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup3.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when the juniRadiusClientNasIpAddrUse object was added.')
juniRadiusBrasClientGroup4 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 11)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup4 = juniRadiusBrasClientGroup4.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup4.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when objects were added to indicate which RADIUS attributes should be included or excluded from RADIUS packets.')
juniRadiusBrasClientGroup5 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 12)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup5 = juniRadiusBrasClientGroup5.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup5.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when objects to ignore attributes and enable/disable traps were added.')
juniRadiusAuthClientGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 13)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessAccepts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRejects"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessChallenges"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientMalformedAccessResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgDeadTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientNextAvailableServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAuthClientGroup2 = juniRadiusAuthClientGroup2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAuthClientGroup2.setDescription('Obsolete collection of objects providing management of RADIUS Authentication Clients. This group became obsolete when authentication server available notification support was added.')
juniRadiusAcctClientGroup2 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 14)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientMalformedResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgDeadTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientNextAvailableServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAcctClientGroup2 = juniRadiusAcctClientGroup2.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctClientGroup2.setDescription('Obsolete collection of objects providing management of RADIUS Accounting Clients. This group became obsolete when detailed accounting statistics and accounting server available notification support were added.')
juniRadiusAuthNotificationGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 15)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientNoServerAvailable"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAuthNotificationGroup = juniRadiusAuthNotificationGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAuthNotificationGroup.setDescription('Obsolete collection of management notifications for RADIUS authentication events. This group became obsolete when authentication server available notification was added.')
juniRadiusAcctNotificationGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 16)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientNoServerAvailable"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusAcctNotificationGroup = juniRadiusAcctNotificationGroup.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctNotificationGroup.setDescription('Obsolete collection of management notifications for RADIUS accounting events. This group became obsolete when accounting server available notification was added.')
juniRadiusBrasClientGroup6 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 17)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup6 = juniRadiusBrasClientGroup6.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup6.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when objects for PPPoE Nas-Port format were added.')
juniRadiusAuthClientGroup3 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 18)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessAccepts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessRejects"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAccessChallenges"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientMalformedAccessResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientCfgDeadTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientNextAvailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientAvailableServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAuthClientGroup3 = juniRadiusAuthClientGroup3.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthClientGroup3.setDescription('A collection of objects providing management of RADIUS Authentication Clients.')
juniRadiusAcctClientGroup3 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 19)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStartRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInterimRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStopRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStartResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInterimResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStopResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientMalformedResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgDeadTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusAcctClientNextAvailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientAvailableServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAcctClientGroup3 = juniRadiusAcctClientGroup3.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusAcctClientGroup3.setDescription('Obsolete collection of objects providing management of RADIUS Accounting Clients. This group became obsolete when reject counters were added.')
juniRadiusAuthNotificationGroup2 = NotificationGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 20)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientNoServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAuthClientServerAvailable"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAuthNotificationGroup2 = juniRadiusAuthNotificationGroup2.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAuthNotificationGroup2.setDescription('Management notifications for RADIUS authentication events.')
juniRadiusAcctNotificationGroup2 = NotificationGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 21)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientNoServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerAvailable"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAcctNotificationGroup2 = juniRadiusAcctNotificationGroup2.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctNotificationGroup2.setDescription('Management notifications for RADIUS accounting events.')
juniRadiusBrasClientGroup7 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 22)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup7 = juniRadiusBrasClientGroup7.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup7.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when an object for VLAN Nas-Port format was added.')
juniRadiusAcctClientGroup4 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 23)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInvalidServerAddresses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRoundTripTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStartRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInterimRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStopRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRejectRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRetransmissions"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStartResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientInterimResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientStopResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientRejectResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientMalformedResponses"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientBadAuthenticators"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientTimeouts"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnknownTypes"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientPacketsDropped"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgServerPortNumber"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgKey"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgTimeoutInterval"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRetries"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgMaxPendingRequests"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgRowStatus"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientCfgPrecedence"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusAcctClientCfgDeadTime"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientUnavailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientNextAvailableServer"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusAcctClientAvailableServer"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusAcctClientGroup4 = juniRadiusAcctClientGroup4.setStatus('current')
if mibBuilder.loadTexts: juniRadiusAcctClientGroup4.setDescription('A collection of objects providing management of RADIUS Accounting Clients.')
juniRadiusBrasClientGroup8 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 24)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup8 = juniRadiusBrasClientGroup8.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup8.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when new objects were added to indicate which RADIUS attributes should be included or excluded from RADIUS packets.')
juniRadiusBrasClientGroup9 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 25)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup9 = juniRadiusBrasClientGroup9.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup9.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup10 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 26)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup10 = juniRadiusBrasClientGroup10.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup10.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients. This group became obsolete when new objects were added to enable/disable the overriding of the nas-port-id and/or calling-station-id values with the PPPoE Remote Circuit Id.')
juniRadiusBrasClientGroup11 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 27)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup11 = juniRadiusBrasClientGroup11.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup11.setDescription('Obsolete collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup12 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 28)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup12 = juniRadiusBrasClientGroup12.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup12.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup13 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 29)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup13 = juniRadiusBrasClientGroup13.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup13.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup14 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 30)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
juniRadiusBrasClientGroup14 = juniRadiusBrasClientGroup14.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup14.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup15 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 31)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup15 = juniRadiusBrasClientGroup15.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup15.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup16 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 32)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup16 = juniRadiusBrasClientGroup16.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup16.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup17 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 33)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup17 = juniRadiusBrasClientGroup17.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup17.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup18 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 34)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnorePppoeMaxSession"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup18 = juniRadiusBrasClientGroup18.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup18.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup19 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 35)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnorePppoeMaxSession"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6AccountingInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup19 = juniRadiusBrasClientGroup19.setStatus('obsolete')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup19.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
juniRadiusBrasClientGroup20 = ObjectGroup((1, 3, 6, 1, 4, 1, 4874, 2, 2, 19, 2, 2, 36)).setObjects(("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientDslPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientAcctSessionIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationDelimiter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientEthernetPortType"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasIpAddrUse"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerEndpointInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTunnelConnectionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCalledStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeCallingStationIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeClassInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeConnectInfoInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEgressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeEventTimestampInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedCompressionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpNetmaskInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIngressPolicyNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasIdentifierInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeNasPortTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigawordsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludePppoeDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelAssignmentIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelClientEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelMediumTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelPreferenceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeTunnelServerAttributesInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerAuthIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelServerEndpointInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeOutputGigapktsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreFramedIpNetmask"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmCategory"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmMbs"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmPcr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreAtmScr"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreEgressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreIngressPolicyName"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnoreVirtualRouter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerUnavailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnNoAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAuthServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientTrapOnAcctServerAvailable"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientPppoeNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeTunnelInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientVlanNasPortFormat"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeAcctMultiSessionIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctMultiSessionIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAscendNumInMultilinkInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientConnectInfoFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeProfileServiceDescrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOn"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctAuthenticInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctDelayTimeInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctSessionIdInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeAcctTerminateCauseInAcctOff"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeMlpppBundleNameInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeDhcpMacAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpOptionsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpMacAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDhcpGiAddressInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientCallingStationIdOverrideRemoteCircuitId"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientOverrideNasInfo"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceDescriptionInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVpi"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthAtmVci"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSlot"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetAdapter"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetPort"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetSVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientNasPortFieldWidthEthernetVlan"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdFormat"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientRemoteCircuitIdDelimiter"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDownStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cUpStreamDataInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDslForumAttributesInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslLineStateInAcctStop"), 
("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeL2cDslTypeInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeInterfaceIdInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpAddrInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIgnorePppoeMaxSession"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6AccountingInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpv6PoolInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpv6PoolInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpv6RouteInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeFramedIpv6RouteInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", 
"juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6VirtualRouterInAcctStart"), ("Juniper-RADIUS-CLIENT-MIB", "juniRadiusClientIncludeIpv6VirtualRouterInAcctStop"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    juniRadiusBrasClientGroup20 = juniRadiusBrasClientGroup20.setStatus('current')
if mibBuilder.loadTexts: juniRadiusBrasClientGroup20.setDescription('A collection of objects providing management of general B-RAS functions for RADIUS Clients.')
mibBuilder.exportSymbols("Juniper-RADIUS-CLIENT-MIB", juniRadiusClientCompliance6=juniRadiusClientCompliance6, juniRadiusClientOverrideNasInfo=juniRadiusClientOverrideNasInfo, juniRadiusClientRemoteCircuitIdDelimiter=juniRadiusClientRemoteCircuitIdDelimiter, juniRadiusClientCompliance16=juniRadiusClientCompliance16, juniRadiusClientIncludeL2cDownStreamDataInAcctStop=juniRadiusClientIncludeL2cDownStreamDataInAcctStop, juniRadiusClientIncludeTunnelAssignmentIdInAcctStop=juniRadiusClientIncludeTunnelAssignmentIdInAcctStop, juniRadiusAuthClientAccessChallenges=juniRadiusAuthClientAccessChallenges, juniRadiusClientTrapControl=juniRadiusClientTrapControl, juniRadiusClientIncludeTunnelAssignmentIdInAcctStart=juniRadiusClientIncludeTunnelAssignmentIdInAcctStart, juniRadiusAcctClientCompliance2=juniRadiusAcctClientCompliance2, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq=juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAccessReq, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq=juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAccessReq, juniRadiusBrasClientGroup17=juniRadiusBrasClientGroup17, juniRadiusClientCompliance4=juniRadiusClientCompliance4, juniRadiusClientIncludeTunnelPreferenceInAcctStop=juniRadiusClientIncludeTunnelPreferenceInAcctStop, juniRadiusClientIncludeTunnelMediumTypeInAcctStart=juniRadiusClientIncludeTunnelMediumTypeInAcctStart, juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq=juniRadiusClientIncludeL2cAccessLoopParametersInAccessReq, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart=juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStart, juniRadiusAcctClientCfgKey=juniRadiusAcctClientCfgKey, juniRadiusClientIncludeTunnelClientEndpointInAcctStop=juniRadiusClientIncludeTunnelClientEndpointInAcctStop, juniRadiusClientIncludeConnectInfoInAcctStart=juniRadiusClientIncludeConnectInfoInAcctStart, juniRadiusClientIncludeAcctSessionIdInAcctOff=juniRadiusClientIncludeAcctSessionIdInAcctOff, 
juniRadiusAuthClientBadAuthenticators=juniRadiusAuthClientBadAuthenticators, juniRadiusClientIncludeNasPortInAcctStop=juniRadiusClientIncludeNasPortInAcctStop, juniRadiusClientIncludeEventTimestampInAcctStart=juniRadiusClientIncludeEventTimestampInAcctStart, juniRadiusAuthClientAccessRequests=juniRadiusAuthClientAccessRequests, juniRadiusBrasClientGroup2=juniRadiusBrasClientGroup2, juniRadiusClientIncludeL2cDslTypeInAcctStop=juniRadiusClientIncludeL2cDslTypeInAcctStop, juniRadiusAuthClientAccessRejects=juniRadiusAuthClientAccessRejects, juniRadiusBrasClientGroup15=juniRadiusBrasClientGroup15, juniRadiusAcctClientRejectResponses=juniRadiusAcctClientRejectResponses, JuniRadiusClientRemoterCircuitIdFormatComponents=JuniRadiusClientRemoterCircuitIdFormatComponents, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart=juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStart, juniRadiusClientUdpChecksum=juniRadiusClientUdpChecksum, juniRadiusClientIncludeAcctTunnelConnectionInAcctStop=juniRadiusClientIncludeAcctTunnelConnectionInAcctStop, juniRadiusClientIncludeL2cDownStreamDataInAccessReq=juniRadiusClientIncludeL2cDownStreamDataInAccessReq, juniRadiusClientIncludeDslForumAttributesInAcctStop=juniRadiusClientIncludeDslForumAttributesInAcctStop, juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq=juniRadiusClientIncludeL2cMaximumDataRateDstrInAccessReq, juniRadiusClientIncludeAcctMultiSessionIdInAcctStart=juniRadiusClientIncludeAcctMultiSessionIdInAcctStart, juniRadiusClientIncludeConnectInfoInAcctStop=juniRadiusClientIncludeConnectInfoInAcctStop, juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop=juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStop, juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq=juniRadiusClientIncludeL2cMinimumDataRateUstrInAccessReq, juniRadiusClientMIB=juniRadiusClientMIB, juniRadiusAcctClientInterimRequests=juniRadiusAcctClientInterimRequests, 
juniRadiusClientIncludeEventTimestampInAcctStop=juniRadiusClientIncludeEventTimestampInAcctStop, juniRadiusAuthClientCfgKey=juniRadiusAuthClientCfgKey, juniRadiusAcctClientInvalidServerAddresses=juniRadiusAcctClientInvalidServerAddresses, juniRadiusClientIncludeTunnelServerAttributesInAcctStop=juniRadiusClientIncludeTunnelServerAttributesInAcctStop, juniRadiusAuthClientServerUnavailable=juniRadiusAuthClientServerUnavailable, juniRadiusTunnelClientGroup=juniRadiusTunnelClientGroup, juniRadiusAcctClientGroup=juniRadiusAcctClientGroup, juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop=juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStop, juniRadiusClientObjects=juniRadiusClientObjects, juniRadiusClientIncludeConnectInfoInAccessReq=juniRadiusClientIncludeConnectInfoInAccessReq, juniRadiusAuthClientGroup3=juniRadiusAuthClientGroup3, juniRadiusClientIncludeL2cUpStreamDataInAcctStart=juniRadiusClientIncludeL2cUpStreamDataInAcctStart, juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop=juniRadiusClientIncludeL2tpPppDisconnectCauseInAcctStop, juniRadiusClientTrapOnAuthServerUnavailable=juniRadiusClientTrapOnAuthServerUnavailable, juniRadiusClientIncludeL2cDslLineStateInAcctStop=juniRadiusClientIncludeL2cDslLineStateInAcctStop, juniRadiusBrasClientGroup6=juniRadiusBrasClientGroup6, juniRadiusClientCompliance17=juniRadiusClientCompliance17, juniRadiusBrasClientGroup10=juniRadiusBrasClientGroup10, juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq=juniRadiusClientIncludeUpStreamCalculatedQosRateInAccessReq, juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq=juniRadiusClientIncludeL2cAccessLoopCircuitIdInAccessReq, juniRadiusAuthClientNoServerAvailable=juniRadiusAuthClientNoServerAvailable, juniRadiusClientMIBConformance=juniRadiusClientMIBConformance, juniRadiusAcctClientGroup2=juniRadiusAcctClientGroup2, juniRadiusClientIncludeIpv6AccountingInAcctStop=juniRadiusClientIncludeIpv6AccountingInAcctStop, 
juniRadiusBrasClientGroup11=juniRadiusBrasClientGroup11, juniRadiusAcctClientServerPortNumber=juniRadiusAcctClientServerPortNumber, juniRadiusAcctClientNextAvailableServer=juniRadiusAcctClientNextAvailableServer, juniRadiusClientCompliance10=juniRadiusClientCompliance10, juniRadiusClientIncludeIpAddrInAcctStop=juniRadiusClientIncludeIpAddrInAcctStop, juniRadiusClientIncludeIngressPolicyNameInAcctStop=juniRadiusClientIncludeIngressPolicyNameInAcctStop, juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart=juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStart, juniRadiusAuthClientCfgServerTable=juniRadiusAuthClientCfgServerTable, juniRadiusClientNasPortFieldWidthAtmSlot=juniRadiusClientNasPortFieldWidthAtmSlot, juniRadiusClientNasPortFieldWidthAtmVci=juniRadiusClientNasPortFieldWidthAtmVci, juniRadiusClientIncludeAscendNumInMultilinkInAcctStart=juniRadiusClientIncludeAscendNumInMultilinkInAcctStart, juniRadiusAcctClientServerTable=juniRadiusAcctClientServerTable, juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop=juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStop, juniRadiusClientIncludePppoeDescriptionInAcctStart=juniRadiusClientIncludePppoeDescriptionInAcctStart, juniRadiusClientIncludeTunnelMediumTypeInAccessReq=juniRadiusClientIncludeTunnelMediumTypeInAccessReq, juniRadiusClientIncludeFramedIpv6RouteInAcctStop=juniRadiusClientIncludeFramedIpv6RouteInAcctStop, juniRadiusAcctClientServerAddress=juniRadiusAcctClientServerAddress, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop=juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStop, juniRadiusBasicClientGroup=juniRadiusBasicClientGroup, juniRadiusClientIncludeCalledStationIdInAcctStop=juniRadiusClientIncludeCalledStationIdInAcctStop, juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop=juniRadiusClientIncludeIpv6PrimaryDnsInAcctStop, juniRadiusClientTrapOnAcctServerUnavailable=juniRadiusClientTrapOnAcctServerUnavailable, 
juniRadiusAuthClientAccessAccepts=juniRadiusAuthClientAccessAccepts, juniRadiusAcctClientPacketsDropped=juniRadiusAcctClientPacketsDropped, juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop=juniRadiusClientIncludeIpv6NdRaPrefixInAcctStop, juniRadiusAuthClientPacketsDropped=juniRadiusAuthClientPacketsDropped, juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart=juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStart, juniRadiusClientIncludeFramedIpNetmaskInAcctStop=juniRadiusClientIncludeFramedIpNetmaskInAcctStop, juniRadiusClientIncludeAcctTunnelConnectionInAcctStart=juniRadiusClientIncludeAcctTunnelConnectionInAcctStart, juniRadiusBrasClientGroup19=juniRadiusBrasClientGroup19, juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart=juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStart, juniRadiusClientCompliance3=juniRadiusClientCompliance3, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop=juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStop, juniRadiusBrasClientGroup13=juniRadiusBrasClientGroup13, juniRadiusClientCompliance5=juniRadiusClientCompliance5, juniRadiusClientAlgorithm=juniRadiusClientAlgorithm, juniRadiusClientTrapOnNoAuthServerAvailable=juniRadiusClientTrapOnNoAuthServerAvailable, juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart=juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStart, juniRadiusClientIncludeDhcpOptionsInAcctStop=juniRadiusClientIncludeDhcpOptionsInAcctStop, juniRadiusClientIncludeL2cDslTypeInAcctStart=juniRadiusClientIncludeL2cDslTypeInAcctStart, juniRadiusClientIncludeOutputGigapktsInAcctStop=juniRadiusClientIncludeOutputGigapktsInAcctStop, juniRadiusClientIncludeProfileServiceDescrInAccessReq=juniRadiusClientIncludeProfileServiceDescrInAccessReq, juniRadiusClientIgnoreVirtualRouter=juniRadiusClientIgnoreVirtualRouter, juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart=juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStart, 
juniRadiusClientIncludeOutputGigawordsInAcctStop=juniRadiusClientIncludeOutputGigawordsInAcctStop, juniRadiusAcctClientCompliance=juniRadiusAcctClientCompliance, juniRadiusAcctClientStartResponses=juniRadiusAcctClientStartResponses, juniRadiusClientNasPortIdOverrideRemoteCircuitId=juniRadiusClientNasPortIdOverrideRemoteCircuitId, juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop=juniRadiusClientIncludeIpv6SecondaryDnsInAcctStop, juniRadiusClientIncludeEventTimestampInAcctOff=juniRadiusClientIncludeEventTimestampInAcctOff, juniRadiusClientIncludeEventTimestampInAcctOn=juniRadiusClientIncludeEventTimestampInAcctOn, juniRadiusClientIncludeInterfaceIdInAcctStop=juniRadiusClientIncludeInterfaceIdInAcctStop, juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq=juniRadiusClientIncludeDownStreamCalculatedQosRateInAccessReq, juniRadiusClientIncludeNasPortInAcctStart=juniRadiusClientIncludeNasPortInAcctStart, juniRadiusClientIncludeAcctDelayTimeInAcctOn=juniRadiusClientIncludeAcctDelayTimeInAcctOn, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart=juniRadiusClientIncludeL2cActInterleavingDelayDstrInAcctStart, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq=juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAccessReq, juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop=juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStop, juniRadiusGeneralClientGroup=juniRadiusGeneralClientGroup, juniRadiusClientIncludeNasIdentifierInAccessReq=juniRadiusClientIncludeNasIdentifierInAccessReq, juniRadiusClientIncludePppoeDescriptionInAccessReq=juniRadiusClientIncludePppoeDescriptionInAccessReq, juniRadiusClientIncludeDhcpOptionsInAcctStart=juniRadiusClientIncludeDhcpOptionsInAcctStart, juniRadiusClientEthernetPortType=juniRadiusClientEthernetPortType, juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq=juniRadiusClientIncludeL2cActualDataRateDstrInAccessReq, 
juniRadiusClientIncludeTunnelClientAuthIdInAcctStart=juniRadiusClientIncludeTunnelClientAuthIdInAcctStart, juniRadiusClientIncludeTunnelClientEndpointInAcctStart=juniRadiusClientIncludeTunnelClientEndpointInAcctStart, juniRadiusAuthClientAccessRetransmissions=juniRadiusAuthClientAccessRetransmissions, juniRadiusClientCallingStationIdOverrideRemoteCircuitId=juniRadiusClientCallingStationIdOverrideRemoteCircuitId, juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart=juniRadiusClientIncludeIpv6NdRaPrefixInAcctStart, juniRadiusClientCompliance18=juniRadiusClientCompliance18, juniRadiusClientCallingStationDelimiter=juniRadiusClientCallingStationDelimiter, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart=juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStart, juniRadiusClientIncludeTunnelTypeInAcctStop=juniRadiusClientIncludeTunnelTypeInAcctStop, juniRadiusClientIncludeDslForumAttributesInAccessReq=juniRadiusClientIncludeDslForumAttributesInAccessReq, juniRadiusClientMIBCompliances=juniRadiusClientMIBCompliances, juniRadiusClientCompliance15=juniRadiusClientCompliance15, juniRadiusClientIgnoreAtmPcr=juniRadiusClientIgnoreAtmPcr, juniRadiusBrasClientGroup9=juniRadiusBrasClientGroup9, juniRadiusAcctClientBadAuthenticators=juniRadiusAcctClientBadAuthenticators, juniRadiusClientIncludeNasIdentifierInAcctOn=juniRadiusClientIncludeNasIdentifierInAcctOn, juniRadiusClientIncludeClassInAcctStart=juniRadiusClientIncludeClassInAcctStart, juniRadiusClientIncludeDhcpMacAddressInAcctStop=juniRadiusClientIncludeDhcpMacAddressInAcctStop, juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq=juniRadiusClientIncludeL2cActualDataRateUstrInAccessReq, juniRadiusClientIncludeProfileServiceDescrInAcctStop=juniRadiusClientIncludeProfileServiceDescrInAcctStop, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart=juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStart, 
juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop=juniRadiusClientIncludeIpv6LocalInterfaceInAcctStop, juniRadiusClientRollover=juniRadiusClientRollover, juniRadiusClientIncludeL2cUpStreamDataInAcctStop=juniRadiusClientIncludeL2cUpStreamDataInAcctStop, juniRadiusClientIncludeAcctDelayTimeInAcctOff=juniRadiusClientIncludeAcctDelayTimeInAcctOff, juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop=juniRadiusClientIncludeL2cMinLowPowerDataRateDstrInAcctStop, juniRadiusAuthClientTimeouts=juniRadiusAuthClientTimeouts, juniRadiusClientIncludeDhcpOptionsInAccessReq=juniRadiusClientIncludeDhcpOptionsInAccessReq, juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart=juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStart, juniRadiusClientIncludeIpv6PrefixInAcctStop=juniRadiusClientIncludeIpv6PrefixInAcctStop, juniRadiusClientIncludeDhcpMacAddressInAccessReq=juniRadiusClientIncludeDhcpMacAddressInAccessReq, juniRadiusClientRemoteCircuitIdFormat=juniRadiusClientRemoteCircuitIdFormat, juniRadiusClientTrapOnAcctServerAvailable=juniRadiusClientTrapOnAcctServerAvailable, juniRadiusClientIncludeInputGigawordsInAcctStop=juniRadiusClientIncludeInputGigawordsInAcctStop, juniRadiusClientIncludeInterfaceDescriptionInAcctStart=juniRadiusClientIncludeInterfaceDescriptionInAcctStart, juniRadiusAcctClientStartRequests=juniRadiusAcctClientStartRequests, juniRadiusAcctClientCfgTimeoutInterval=juniRadiusAcctClientCfgTimeoutInterval, juniRadiusClientIncludeEgressPolicyNameInAcctStop=juniRadiusClientIncludeEgressPolicyNameInAcctStop, juniRadiusClientCompliance13=juniRadiusClientCompliance13, juniRadiusAcctClientCfgServerTable=juniRadiusAcctClientCfgServerTable, juniRadiusClientPppoeNasPortFormat=juniRadiusClientPppoeNasPortFormat, juniRadiusAuthClientServerAvailable=juniRadiusAuthClientServerAvailable, juniRadiusClientIncludePppoeDescriptionInAcctStop=juniRadiusClientIncludePppoeDescriptionInAcctStop, juniRadiusClientTraps=juniRadiusClientTraps, 
juniRadiusBrasClientGroup3=juniRadiusBrasClientGroup3, juniRadiusClientIgnoreEgressPolicyName=juniRadiusClientIgnoreEgressPolicyName, juniRadiusAuthClient=juniRadiusAuthClient, juniRadiusBrasClientGroup16=juniRadiusBrasClientGroup16, juniRadiusClientNasPortFieldWidthEthernetVlan=juniRadiusClientNasPortFieldWidthEthernetVlan, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop=juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStop, juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop=juniRadiusClientIncludeL2cActualDataRateUstrInAcctStop, juniRadiusClientIncludeNasIdentifierInAcctOff=juniRadiusClientIncludeNasIdentifierInAcctOff, juniRadiusClientIncludeInterfaceDescriptionInAccessReq=juniRadiusClientIncludeInterfaceDescriptionInAccessReq, juniRadiusClientIncludeFramedIpNetmaskInAcctStart=juniRadiusClientIncludeFramedIpNetmaskInAcctStart, juniRadiusClientNasPortFieldWidthEthernetSVlan=juniRadiusClientNasPortFieldWidthEthernetSVlan, juniRadiusClientIncludeTunnelServerEndpointInAccessReq=juniRadiusClientIncludeTunnelServerEndpointInAccessReq, juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop=juniRadiusClientIncludeL2cActualDataRateDstrInAcctStop, juniRadiusClientIgnoreAtmScr=juniRadiusClientIgnoreAtmScr, juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart=juniRadiusClientIncludeL2cActualDataRateDstrInAcctStart, juniRadiusBrasClientGroup12=juniRadiusBrasClientGroup12, juniRadiusClientTrapOnAuthServerAvailable=juniRadiusClientTrapOnAuthServerAvailable, juniRadiusAcctClientMalformedResponses=juniRadiusAcctClientMalformedResponses, juniRadiusClientIncludeAcctMultiSessionIdInAcctStop=juniRadiusClientIncludeAcctMultiSessionIdInAcctStop, juniRadiusClientIncludeTunnelClientEndpointInAccessReq=juniRadiusClientIncludeTunnelClientEndpointInAccessReq, juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq=juniRadiusClientIncludeL2cMaximumDataRateUstrInAccessReq, 
juniRadiusClientIncludeMlpppBundleNameInAcctStop=juniRadiusClientIncludeMlpppBundleNameInAcctStop, juniRadiusClientIncludeClassInAcctStop=juniRadiusClientIncludeClassInAcctStop, juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart=juniRadiusClientIncludeL2cActualDataRateUstrInAcctStart, juniRadiusAcctClientCfgServerEntry=juniRadiusAcctClientCfgServerEntry, juniRadiusClientIgnoreFramedIpNetmask=juniRadiusClientIgnoreFramedIpNetmask, juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq=juniRadiusClientIncludeL2cAttainDataRateUstrInAccessReq, juniRadiusAcctClientServerUnavailable=juniRadiusAcctClientServerUnavailable, juniRadiusClientIgnoreIngressPolicyName=juniRadiusClientIgnoreIngressPolicyName, juniRadiusClientIncludeTunnelTypeInAcctStart=juniRadiusClientIncludeTunnelTypeInAcctStart, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq=juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAccessReq, juniRadiusClientIncludeIngressPolicyNameInAcctStart=juniRadiusClientIncludeIngressPolicyNameInAcctStart, juniRadiusAcctClientStopRequests=juniRadiusAcctClientStopRequests, juniRadiusClientIncludeTunnelServerAttributesInAcctStart=juniRadiusClientIncludeTunnelServerAttributesInAcctStart, juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq=juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAccessReq, juniRadiusClientIncludeCallingStationIdInAcctStop=juniRadiusClientIncludeCallingStationIdInAcctStop, juniRadiusClientIncludeAcctSessionIdInAccessReq=juniRadiusClientIncludeAcctSessionIdInAccessReq, juniRadiusAuthClientCompliance2=juniRadiusAuthClientCompliance2, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq=juniRadiusClientIncludeL2cActInterleavingDelayUstrInAccessReq, juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart=juniRadiusClientIncludeL2cMaximumDataRateDstrInAcctStart, juniRadiusAuthClientRoundTripTime=juniRadiusAuthClientRoundTripTime, juniRadiusAuthClientServerPortNumber=juniRadiusAuthClientServerPortNumber, 
juniRadiusClientIncludeNasPortTypeInAcctStop=juniRadiusClientIncludeNasPortTypeInAcctStop, juniRadiusAuthClientServerAddress=juniRadiusAuthClientServerAddress, juniRadiusClientNasIdentifier=juniRadiusClientNasIdentifier, juniRadiusClientIncludeNasIdentifierInAcctStart=juniRadiusClientIncludeNasIdentifierInAcctStart, juniRadiusClientCompliance=juniRadiusClientCompliance, juniRadiusAuthClientUnavailableServer=juniRadiusAuthClientUnavailableServer, juniRadiusClientIncludeIpv6VirtualRouterInAcctStop=juniRadiusClientIncludeIpv6VirtualRouterInAcctStop, juniRadiusClientIncludeDhcpGiAddressInAcctStop=juniRadiusClientIncludeDhcpGiAddressInAcctStop, juniRadiusClientIncludeCallingStationIdInAccessReq=juniRadiusClientIncludeCallingStationIdInAccessReq, juniRadiusClientIncludeNasPortIdInAcctStop=juniRadiusClientIncludeNasPortIdInAcctStop, juniRadiusAcctClientNoServerAvailable=juniRadiusAcctClientNoServerAvailable, juniRadiusClientNasIpAddrUse=juniRadiusClientNasIpAddrUse, juniRadiusClientIncludeAcctMultiSessionIdInAccessReq=juniRadiusClientIncludeAcctMultiSessionIdInAccessReq, juniRadiusClientIncludeTunnelClientAuthIdInAccessReq=juniRadiusClientIncludeTunnelClientAuthIdInAccessReq, juniRadiusBrasClientGroup=juniRadiusBrasClientGroup, juniRadiusAuthClientMalformedAccessResponses=juniRadiusAuthClientMalformedAccessResponses, juniRadiusAcctClientCfgServerAddress=juniRadiusAcctClientCfgServerAddress, juniRadiusBrasClientGroup18=juniRadiusBrasClientGroup18, juniRadiusAcctClientPendingRequests=juniRadiusAcctClientPendingRequests, juniRadiusAuthClientServerTable=juniRadiusAuthClientServerTable, juniRadiusClientIncludeCalledStationIdInAccessReq=juniRadiusClientIncludeCalledStationIdInAccessReq, juniRadiusClientIncludeInputGigapktsInAcctStop=juniRadiusClientIncludeInputGigapktsInAcctStop, juniRadiusAcctNotificationGroup=juniRadiusAcctNotificationGroup, juniRadiusClientIncludeTunnelServerEndpointInAcctStop=juniRadiusClientIncludeTunnelServerEndpointInAcctStop, 
juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop=juniRadiusClientIncludeL2cMinLowPowerDataRateUstrInAcctStop, juniRadiusClientNasPortFieldWidthEthernetAdapter=juniRadiusClientNasPortFieldWidthEthernetAdapter, juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart=juniRadiusClientIncludeL2cMinimumDataRateDstrInAcctStart, juniRadiusClientCompliance11=juniRadiusClientCompliance11, juniRadiusAcctClientCfgRowStatus=juniRadiusAcctClientCfgRowStatus, juniRadiusClientIncludeDhcpGiAddressInAccessReq=juniRadiusClientIncludeDhcpGiAddressInAccessReq, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart=juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAcctStart, juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop=juniRadiusClientIncludeDelegatedIpv6PrefixInAcctStop, juniRadiusAcctClientTimeouts=juniRadiusAcctClientTimeouts, juniRadiusAcctClientCfgRetries=juniRadiusAcctClientCfgRetries, juniRadiusAcctClientCfgServerPortNumber=juniRadiusAcctClientCfgServerPortNumber, juniRadiusClientTunnelAccounting=juniRadiusClientTunnelAccounting)
mibBuilder.exportSymbols("Juniper-RADIUS-CLIENT-MIB", juniRadiusAuthClientCfgServerAddress=juniRadiusAuthClientCfgServerAddress, juniRadiusClientIncludeInterfaceDescriptionInAcctStop=juniRadiusClientIncludeInterfaceDescriptionInAcctStop, juniRadiusAcctClientUnavailableServer=juniRadiusAcctClientUnavailableServer, juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop=juniRadiusClientIncludeL2cMinimumDataRateUstrInAcctStop, juniRadiusAuthClientNextAvailableServer=juniRadiusAuthClientNextAvailableServer, juniRadiusAuthClientAvailableServer=juniRadiusAuthClientAvailableServer, juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq=juniRadiusClientIncludeL2cAccessAggrCircuitIdAsciiInAccessReq, juniRadiusClientIncludeFramedCompressionInAcctStart=juniRadiusClientIncludeFramedCompressionInAcctStart, juniRadiusGeneralClient=juniRadiusGeneralClient, juniRadiusClientIgnorePppoeMaxSession=juniRadiusClientIgnorePppoeMaxSession, juniRadiusClientCompliance14=juniRadiusClientCompliance14, juniRadiusAcctClientRequests=juniRadiusAcctClientRequests, juniRadiusAuthClientCfgTimeoutInterval=juniRadiusAuthClientCfgTimeoutInterval, juniRadiusClientIncludeTunnelServerEndpointInAcctStart=juniRadiusClientIncludeTunnelServerEndpointInAcctStart, juniRadiusClientCompliance7=juniRadiusClientCompliance7, juniRadiusAuthClientGroup=juniRadiusAuthClientGroup, juniRadiusClientIncludeTunnelServerAuthIdInAccessReq=juniRadiusClientIncludeTunnelServerAuthIdInAccessReq, juniRadiusAcctClientRejectRequests=juniRadiusAcctClientRejectRequests, juniRadiusAuthClientCfgServerPortNumber=juniRadiusAuthClientCfgServerPortNumber, juniRadiusAcctClientCfgMaxPendingRequests=juniRadiusAcctClientCfgMaxPendingRequests, juniRadiusClientIncludeMlpppBundleNameInAcctStart=juniRadiusClientIncludeMlpppBundleNameInAcctStart, juniRadiusBrasClientGroup20=juniRadiusBrasClientGroup20, juniRadiusClientIncludeDhcpGiAddressInAcctStart=juniRadiusClientIncludeDhcpGiAddressInAcctStart, 
juniRadiusClientDslPortType=juniRadiusClientDslPortType, juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop=juniRadiusClientIncludeL2cAttainDataRateUstrInAcctStop, juniRadiusClientIncludeDslForumAttributesInAcctStart=juniRadiusClientIncludeDslForumAttributesInAcctStart, juniRadiusClientIncludeIpv6VirtualRouterInAcctStart=juniRadiusClientIncludeIpv6VirtualRouterInAcctStart, juniRadiusClientAcctSessionIdFormat=juniRadiusClientAcctSessionIdFormat, juniRadiusAcctClientRoundTripTime=juniRadiusAcctClientRoundTripTime, juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop=juniRadiusClientIncludeL2cAccessAggrCircuitIdBinaryInAcctStop, juniRadiusAcctClientRetransmissions=juniRadiusAcctClientRetransmissions, juniRadiusClientIncludeTunnelInterfaceIdInAcctStop=juniRadiusClientIncludeTunnelInterfaceIdInAcctStop, juniRadiusAuthClientCompliance=juniRadiusAuthClientCompliance, juniRadiusAuthClientCfgDeadTime=juniRadiusAuthClientCfgDeadTime, juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq=juniRadiusClientIncludeL2cActInterleavingDelayDstrInAccessReq, juniRadiusAcctClientCfgPrecedence=juniRadiusAcctClientCfgPrecedence, juniRadiusClientNasPortFieldWidthEthernetPort=juniRadiusClientNasPortFieldWidthEthernetPort, juniRadiusClientIncludeL2cUpStreamDataInAccessReq=juniRadiusClientIncludeL2cUpStreamDataInAccessReq, juniRadiusClientAtmNasPortFormat=juniRadiusClientAtmNasPortFormat, juniRadiusClientIncludeL2cDslLineStateInAcctStart=juniRadiusClientIncludeL2cDslLineStateInAcctStart, juniRadiusClientIncludeInterfaceIdInAcctStart=juniRadiusClientIncludeInterfaceIdInAcctStart, juniRadiusClientIncludeMlpppBundleNameInAccessReq=juniRadiusClientIncludeMlpppBundleNameInAccessReq, juniRadiusClientIncludeFramedIpv6PoolInAcctStart=juniRadiusClientIncludeFramedIpv6PoolInAcctStart, juniRadiusClientIncludeCalledStationIdInAcctStart=juniRadiusClientIncludeCalledStationIdInAcctStart, 
juniRadiusClientIncludeCallingStationIdInAcctStart=juniRadiusClientIncludeCallingStationIdInAcctStart, juniRadiusAcctClientInterimResponses=juniRadiusAcctClientInterimResponses, juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart=juniRadiusClientIncludeL2cMaximumDataRateUstrInAcctStart, juniRadiusClientIncludeNasPortTypeInAcctStart=juniRadiusClientIncludeNasPortTypeInAcctStart, juniRadiusClientCompliance8=juniRadiusClientCompliance8, juniRadiusClientIncludeTunnelInterfaceIdInAcctStart=juniRadiusClientIncludeTunnelInterfaceIdInAcctStart, juniRadiusAcctNotificationGroup2=juniRadiusAcctNotificationGroup2, juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop=juniRadiusClientIncludeL2cActInterleavingDelayUstrInAcctStop, juniRadiusAcctClient=juniRadiusAcctClient, juniRadiusClientMIBGroups=juniRadiusClientMIBGroups, juniRadiusClientConnectInfoFormat=juniRadiusClientConnectInfoFormat, juniRadiusClientIncludeAcctSessionIdInAcctOn=juniRadiusClientIncludeAcctSessionIdInAcctOn, juniRadiusAcctClientUnknownTypes=juniRadiusAcctClientUnknownTypes, juniRadiusClientIncludeIpAddrInAcctStart=juniRadiusClientIncludeIpAddrInAcctStart, juniRadiusBrasClientGroup5=juniRadiusBrasClientGroup5, juniRadiusAuthClientServerEntry=juniRadiusAuthClientServerEntry, juniRadiusAuthClientPendingRequests=juniRadiusAuthClientPendingRequests, juniRadiusClientIncludeEgressPolicyNameInAcctStart=juniRadiusClientIncludeEgressPolicyNameInAcctStart, juniRadiusClientIncludeTunnelClientAuthIdInAcctStop=juniRadiusClientIncludeTunnelClientAuthIdInAcctStop, juniRadiusClientIncludeAcctTerminateCauseInAcctOff=juniRadiusClientIncludeAcctTerminateCauseInAcctOff, juniRadiusClientIncludeAcctAuthenticInAcctOn=juniRadiusClientIncludeAcctAuthenticInAcctOn, juniRadiusClientNasPortFieldWidthAtmVpi=juniRadiusClientNasPortFieldWidthAtmVpi, juniRadiusAuthClientCfgPrecedence=juniRadiusAuthClientCfgPrecedence, juniRadiusClientNasPortFieldWidthAtmPort=juniRadiusClientNasPortFieldWidthAtmPort, 
juniRadiusBrasClientGroup8=juniRadiusBrasClientGroup8, juniRadiusClientCompliance12=juniRadiusClientCompliance12, juniRadiusAuthClientInvalidServerAddresses=juniRadiusAuthClientInvalidServerAddresses, juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart=juniRadiusClientIncludeIpv6SecondaryDnsInAcctStart, juniRadiusClientIncludeTunnelServerAuthIdInAcctStart=juniRadiusClientIncludeTunnelServerAuthIdInAcctStart, juniRadiusClientNasPortFormat=juniRadiusClientNasPortFormat, juniRadiusClientTrapOnNoAcctServerAvailable=juniRadiusClientTrapOnNoAcctServerAvailable, juniRadiusClientIncludeAcctAuthenticInAcctOff=juniRadiusClientIncludeAcctAuthenticInAcctOff, juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart=juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStart, juniRadiusClientIncludeL2cDslTypeInAccessReq=juniRadiusClientIncludeL2cDslTypeInAccessReq, juniRadiusClientIncludeTunnelServerAttributesInAccessReq=juniRadiusClientIncludeTunnelServerAttributesInAccessReq, juniRadiusAuthClientCfgRetries=juniRadiusAuthClientCfgRetries, juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq=juniRadiusClientIncludeL2cMinimumDataRateDstrInAccessReq, juniRadiusClientCompliance9=juniRadiusClientCompliance9, juniRadiusClientIncludeNasPortInAccessReq=juniRadiusClientIncludeNasPortInAccessReq, juniRadiusClientIncludeTunnelInterfaceIdInAccessReq=juniRadiusClientIncludeTunnelInterfaceIdInAccessReq, juniRadiusClientIgnoreAtmMbs=juniRadiusClientIgnoreAtmMbs, juniRadiusAuthClientCfgMaxPendingRequests=juniRadiusAuthClientCfgMaxPendingRequests, juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart=juniRadiusClientIncludeIpv6LocalInterfaceInAcctStart, juniRadiusClientIncludeNasIdentifierInAcctStop=juniRadiusClientIncludeNasIdentifierInAcctStop, juniRadiusClientIdentifier=juniRadiusClientIdentifier, juniRadiusClientIncludeNasPortIdInAcctStart=juniRadiusClientIncludeNasPortIdInAcctStart, 
juniRadiusClientIncludeAcctTunnelConnectionInAccessReq=juniRadiusClientIncludeAcctTunnelConnectionInAccessReq, juniRadiusClientIncludeL2cDownStreamDataInAcctStart=juniRadiusClientIncludeL2cDownStreamDataInAcctStart, juniRadiusAuthClientCfgServerEntry=juniRadiusAuthClientCfgServerEntry, juniRadiusClientCallingStationIdFormat=juniRadiusClientCallingStationIdFormat, juniRadiusAcctClientGroup3=juniRadiusAcctClientGroup3, juniRadiusClientIncludeFramedIpv6PoolInAcctStop=juniRadiusClientIncludeFramedIpv6PoolInAcctStop, juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop=juniRadiusClientIncludeUpStreamCalculatedQosRateInAcctStop, juniRadiusAcctClientCfgDeadTime=juniRadiusAcctClientCfgDeadTime, juniRadiusClientVlanNasPortFormat=juniRadiusClientVlanNasPortFormat, juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart=juniRadiusClientIncludeDownStreamCalculatedQosRateInAcctStart, juniRadiusBrasClientGroup4=juniRadiusBrasClientGroup4, juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart=juniRadiusClientIncludeIpv6PrimaryDnsInAcctStart, juniRadiusGeneralClientGroup2=juniRadiusGeneralClientGroup2, juniRadiusAcctClientGroup4=juniRadiusAcctClientGroup4, juniRadiusAcctClientServerEntry=juniRadiusAcctClientServerEntry, juniRadiusAcctClientResponses=juniRadiusAcctClientResponses, juniRadiusClientIncludeProfileServiceDescrInAcctStart=juniRadiusClientIncludeProfileServiceDescrInAcctStart, juniRadiusClientIncludeNasPortTypeInAccessReq=juniRadiusClientIncludeNasPortTypeInAccessReq, juniRadiusAcctClientStopResponses=juniRadiusAcctClientStopResponses, juniRadiusClientNasPortFieldWidthAtmAdapter=juniRadiusClientNasPortFieldWidthAtmAdapter, juniRadiusAuthNotificationGroup=juniRadiusAuthNotificationGroup, PYSNMP_MODULE_ID=juniRadiusClientMIB, juniRadiusClientIncludeTunnelTypeInAccessReq=juniRadiusClientIncludeTunnelTypeInAccessReq, juniRadiusClientIncludeTunnelServerAuthIdInAcctStop=juniRadiusClientIncludeTunnelServerAuthIdInAcctStop, 
juniRadiusClientIncludeIpv6PrefixInAcctStart=juniRadiusClientIncludeIpv6PrefixInAcctStart, juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq=juniRadiusClientIncludeL2cAttainDataRateDstrInAccessReq, juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop=juniRadiusClientIncludeL2cMaxInterleavingDelayUstrInAcctStop, juniRadiusAuthClientUnknownTypes=juniRadiusAuthClientUnknownTypes, juniRadiusClientIncludeAscendNumInMultilinkInAccessReq=juniRadiusClientIncludeAscendNumInMultilinkInAccessReq, juniRadiusClientIncludeDhcpMacAddressInAcctStart=juniRadiusClientIncludeDhcpMacAddressInAcctStart, juniRadiusClientTrapPrefix=juniRadiusClientTrapPrefix, juniRadiusAuthClientGroup2=juniRadiusAuthClientGroup2, juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop=juniRadiusClientIncludeL2cAttainDataRateDstrInAcctStop, juniRadiusBrasClientGroup14=juniRadiusBrasClientGroup14, juniRadiusClientNasPortFieldWidthEthernetSlot=juniRadiusClientNasPortFieldWidthEthernetSlot, juniRadiusAuthClientCfgRowStatus=juniRadiusAuthClientCfgRowStatus, juniRadiusBasicClientGroup2=juniRadiusBasicClientGroup2, juniRadiusClientCompliance2=juniRadiusClientCompliance2, juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop=juniRadiusClientIncludeL2cAccessLoopCircuitIdInAcctStop, juniRadiusClientIncludeL2cDslLineStateInAccessReq=juniRadiusClientIncludeL2cDslLineStateInAccessReq, juniRadiusClientIgnoreAtmCategory=juniRadiusClientIgnoreAtmCategory, juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart=juniRadiusClientIncludeL2cMaxInterleavingDelayDstrInAcctStart, juniRadiusClientIncludeFramedIpv6RouteInAcctStart=juniRadiusClientIncludeFramedIpv6RouteInAcctStart, juniRadiusAuthNotificationGroup2=juniRadiusAuthNotificationGroup2, juniRadiusBrasClientGroup7=juniRadiusBrasClientGroup7, juniRadiusClientIncludeNasPortIdInAccessReq=juniRadiusClientIncludeNasPortIdInAccessReq, juniRadiusAcctClientAvailableServer=juniRadiusAcctClientAvailableServer, 
juniRadiusClientIncludeFramedCompressionInAcctStop=juniRadiusClientIncludeFramedCompressionInAcctStop, juniRadiusClientEthernetNasPortFormat=juniRadiusClientEthernetNasPortFormat, juniRadiusClientIncludeAscendNumInMultilinkInAcctStop=juniRadiusClientIncludeAscendNumInMultilinkInAcctStop, juniRadiusClientSourceAddress=juniRadiusClientSourceAddress, juniRadiusClientIncludeTunnelPreferenceInAcctStart=juniRadiusClientIncludeTunnelPreferenceInAcctStart, juniRadiusClientIncludeTunnelMediumTypeInAcctStop=juniRadiusClientIncludeTunnelMediumTypeInAcctStop, juniRadiusAcctClientServerAvailable=juniRadiusAcctClientServerAvailable)
| 362.511355 | 22,441 | 0.836173 | 36,538 | 462,927 | 10.594039 | 0.036291 | 0.090616 | 0.113076 | 0.165788 | 0.737432 | 0.695356 | 0.687877 | 0.682106 | 0.637976 | 0.631073 | 0 | 0.02292 | 0.049312 | 462,927 | 1,276 | 22,442 | 362.795455 | 0.856621 | 0.000743 | 0 | 0.04739 | 0 | 0.210442 | 0.586823 | 0.460367 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.005622 | 0 | 0.009639 | 0 | 0 | 0 | 1 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d024fb7f0cf0eb0f781b38cd7aab5b54e80f9e0f | 10,851 | py | Python | REST_UBER/sam_rest_old/sam_dataqueries.py | quanted/pram_flask | f8e7bdc1433699a08d8da501e876dcbad584aeea | [
"Unlicense"
] | null | null | null | REST_UBER/sam_rest_old/sam_dataqueries.py | quanted/pram_flask | f8e7bdc1433699a08d8da501e876dcbad584aeea | [
"Unlicense"
] | null | null | null | REST_UBER/sam_rest_old/sam_dataqueries.py | quanted/pram_flask | f8e7bdc1433699a08d8da501e876dcbad584aeea | [
"Unlicense"
] | null | null | null | # coding: utf-8
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pylab import *
## agg backend is used to create plot as a .png file
# mpl.use('agg')
# mongo call support
import os
import json
import auth_s3
import requests
import logging
# Set HTTP header
http_headers = auth_s3.setHTTPHeaders()
# this probably not set for back end
url_part1 = os.environ['UBERTOOL_REST_SERVER']
##############################
## mongo calls
###########################function to retrieve model object from MongoDB################################
def get_model_object(jid, model_name):
"""Retrieves JSON from MongoDB representing model (Python) object and returns it as Python dictionary"""
all_dic = {"jid": jid, "model_name": model_name}
data = json.dumps(all_dic)
url = url_part1 + '/get_model_object'
try:
response = requests.post(url, data=data, headers=http_headers, timeout=60)
if response:
model_object = json.loads(response.content)['model_object']
else:
model_object = ""
except:
return {"error": "error"}
return model_object
###########################function to retrieve model object from MongoDB################################
def get_sam_huc_output(jid, huc12):
"""Retrieves JSON from MongoDB representing model (Python) object and returns it as Python dictionary"""
all_dic = {"jid": jid, "model_name": "sam", "huc12": huc12}
data = json.dumps(all_dic)
url = url_part1 + '/get_sam_huc_output'
try:
response = requests.post(url, data=data, headers=http_headers, timeout=60)
if response:
model_object = json.loads(response.content)['huc12_output']
else:
model_object = ""
except:
logging.exception(Exception)
return "error"
return model_object
#############################
# get data
##############################
##################################
# monthly streak data for huc
# need a dictionary with a single huc as a key and
# 12 monthly values in a list as the value
def GetSAM_MonthlyHUCStreakOutput(jobid, hucid):
# fake
sam_dict = {0: [0., 0., 0., 3.3, 2.4, 1.1, 3.6, 2.0, 0., 0., 0., 0.]}
# actual mongo query - function is in front end in /REST/rest_funcs.py
# sam_dict = get_sam_huc_output(jid, huc12)
# get values from dictionary- return the list
sam_huc = sam_dict.values()
return sam_huc # annual streak data for huc
# need a dictionary with a single huc as a key and
# 30 annual values in a list as the value
def GetSAM_AnnualHUCStreakOutput(jobid, hucid):
# fake
sam_dict = {0: [0., 0., 0., 3.3, 2.4, 1.1, 3.6, 2.0, 0., 0., 0., 0.,
0., 0., 0., 3.3, 2.4, 1.1, 3.6, 2.0, 0., 0., 0., 0.,
0., 0., 0., 3.3, 2.4, 1.1]}
# actual mongo query - function is in front end in /REST/rest_funcs.py
# sam_dict = get_sam_huc_output(jid, huc12)
# get values from dictionary- return the list
sam_huc = sam_dict.values()
return sam_huc # monthly frequency of exceedance data for huc
def GetSAM_MonthlyHUCFreqofExceedOutput(jobid, hucid):
# fake
sam_dict = {0: [0., 0., 0., 0.03, 0.05, 0.05, 0.04, 0.02, 0., 0., 0., 0.]}
# actual mongo query - function is in front end in /REST/rest_funcs.py
# sam_dict = get_sam_huc_output(jid, huc12)
# get values from dictionary- return the list
sam_huc = sam_dict.values()
return sam_huc # annual frequency of exceedance data for huc
def GetSAM_AnnualHUCFreqofExceedOutput(jobid, hucid):
# fake
sam_huc = {0: [0.04, 0.08, 0.06, 0.07, 0.02, 0.03, 0.04, 0.01, 0.11, 0.08, 0.11, 0.03,
0.03, 0.02, 0.04, 0.11, 0.09, 0.07, 0.05, 0.03, 0.02, 0.06, 0.04, 0.06,
0.01, 0.05, 0.03, 0.06, 0.07, 0.02]}
# actual mongo query - function is on front end in /REST/rest_funcs.py
# sam_dict = get_sam_huc_output(jid, huc12)
# get values from dictionary- return the list
sam_huc = sam_dict.values()
return sam_huc ############################################
############################################
# get data as 2-d array for boxplots
# monthly streak data
def GetSAM_MonthlyArrayStreakOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array but the mongo call returns a dictionary of lists
huc01 = rand(12).tolist()
huc02 = rand(12).tolist()
huc03 = rand(12).tolist()
huc04 = rand(12).tolist()
huc05 = rand(12).tolist()
huc06 = rand(12).tolist()
huc07 = rand(12).tolist()
huc08 = rand(12).tolist()
huc09 = rand(12).tolist()
huc10 = rand(12).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to numpy array using pandas dataframe
sam = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
return sam # annual streak data
def GetSAM_AnnualArrayStreakOutput(jobid):
# fake, change to mongoquery
huc01 = rand(30).tolist()
huc02 = rand(30).tolist()
huc03 = rand(30).tolist()
huc04 = rand(30).tolist()
huc05 = rand(30).tolist()
huc06 = rand(30).tolist()
huc07 = rand(30).tolist()
huc08 = rand(30).tolist()
huc09 = rand(30).tolist()
huc10 = rand(30).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to numpy array using pandas dataframe
sam = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
return sam # monthly frequency of exceedance data
def GetSAM_MonthlyArrayFreqofExceedOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array but the mongo call returns a dictionary of lists
huc01 = rand(12).tolist()
huc02 = rand(12).tolist()
huc03 = rand(12).tolist()
huc04 = rand(12).tolist()
huc05 = rand(12).tolist()
huc06 = rand(12).tolist()
huc07 = rand(12).tolist()
huc08 = rand(12).tolist()
huc09 = rand(12).tolist()
huc10 = rand(12).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to numpy array using pandas dataframe
sam = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
return sam # annual frequency of exceedance data
def GetSAM_AnnualArrayFreqofExceedOutput(jobid):
# fake, change to mongoquery
huc01 = rand(30).tolist()
huc02 = rand(30).tolist()
huc03 = rand(30).tolist()
huc04 = rand(30).tolist()
huc05 = rand(30).tolist()
huc06 = rand(30).tolist()
huc07 = rand(30).tolist()
huc08 = rand(30).tolist()
huc09 = rand(30).tolist()
huc10 = rand(30).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to numpy array using pandas dataframe
sam = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
return sam ############################################
############################################
# get all data as 1-d vector for histograms
# all streak data - monthly
def GetSAM_MonthlyVectorStreakOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array
huc01 = rand(12).tolist()
huc02 = rand(12).tolist()
huc03 = rand(12).tolist()
huc04 = rand(12).tolist()
huc05 = rand(12).tolist()
huc06 = rand(12).tolist()
huc07 = rand(12).tolist()
huc08 = rand(12).tolist()
huc09 = rand(12).tolist()
huc10 = rand(12).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to 2-d numpy array using pandas dataframe
sam_matrix = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
# numpy to stack arrays horizontally
sam_vector = np.hstack(sam_matrix)
return sam_vector # all streak data - monthly
def GetSAM_MonthlyVectorFreqofExceedOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array
huc01 = rand(12).tolist()
huc02 = rand(12).tolist()
huc03 = rand(12).tolist()
huc04 = rand(12).tolist()
huc05 = rand(12).tolist()
huc06 = rand(12).tolist()
huc07 = rand(12).tolist()
huc08 = rand(12).tolist()
huc09 = rand(12).tolist()
huc10 = rand(12).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to 2-d numpy array using pandas dataframe
sam_matrix = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
# numpy to stack arrays horizontally
sam_vector = np.hstack(sam_matrix)
return sam_vector # monthly streak data - annual
def GetSAM_AnnualVectorStreakOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array
huc01 = rand(30).tolist()
huc02 = rand(30).tolist()
huc03 = rand(30).tolist()
huc04 = rand(30).tolist()
huc05 = rand(30).tolist()
huc06 = rand(30).tolist()
huc07 = rand(30).tolist()
huc08 = rand(30).tolist()
huc09 = rand(30).tolist()
huc10 = rand(30).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to 2-d numpy array using pandas dataframe
sam_matrix = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
# numpy to stack arrays horizontally
sam_vector = np.hstack(sam_matrix)
return sam_vector # frequency of exceedance data - annual
def GetSAM_AnnualFreqofExceedStreakOutput(jobid):
# fake, change to mongoquery
# rand produces a numpy array
huc01 = rand(30).tolist()
huc02 = rand(30).tolist()
huc03 = rand(30).tolist()
huc04 = rand(30).tolist()
huc05 = rand(30).tolist()
huc06 = rand(30).tolist()
huc07 = rand(30).tolist()
huc08 = rand(30).tolist()
huc09 = rand(30).tolist()
huc10 = rand(30).tolist()
sam_dict = {0: huc01, 1: huc02, 2: huc03, 3: huc04, 4: huc05, 5: huc06, 6: huc07, 7: huc08, 8: huc09, 9: huc10}
# actual mongo query
# sam_dict = get_model_object(jid, "SAM")
# convert dictionary to 2-d numpy array using pandas dataframe
sam_matrix = pd.DataFrame.from_dict(sam_dict, orient="index").as_matrix()
# numpy to stack arrays horizontally
sam_vector = np.hstack(sam_matrix)
return sam_vector ############################################
| 32.782477 | 111 | 0.635057 | 1,582 | 10,851 | 4.257269 | 0.114412 | 0.035635 | 0.071269 | 0.010097 | 0.829844 | 0.820342 | 0.795991 | 0.795991 | 0.784113 | 0.747439 | 0 | 0.088695 | 0.197862 | 10,851 | 330 | 112 | 32.881818 | 0.685087 | 0.303843 | 0 | 0.758621 | 0 | 0 | 0.024153 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.08046 | false | 0 | 0.051724 | 0 | 0.224138 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d030d5d2d3c8be4e6355ec89af123e327a87643b | 39,160 | py | Python | trees/binary_trees/threaded_binary_tree.py | shunsvineyard/shunsvineyard-sample | 5cea78b30b4b8f1ad3dcedc952fca56103638892 | [
"MIT"
] | 7 | 2019-12-11T05:32:52.000Z | 2022-03-18T18:21:09.000Z | trees/binary_trees/threaded_binary_tree.py | shunsvineyard/shunsvineyard-sample | 5cea78b30b4b8f1ad3dcedc952fca56103638892 | [
"MIT"
] | 4 | 2021-01-18T07:21:02.000Z | 2021-07-04T06:22:32.000Z | trees/binary_trees/threaded_binary_tree.py | shunsvineyard/shunsvineyard-sample | 5cea78b30b4b8f1ad3dcedc952fca56103638892 | [
"MIT"
] | 24 | 2019-11-17T08:44:16.000Z | 2022-03-31T06:16:48.000Z | # Copyright © 2021 by Shun Huang. All rights reserved.
# Licensed under MIT License.
# See LICENSE in the project root for license information.
"""Threaded Binary Search Trees."""
from dataclasses import dataclass
from typing import Any, Optional
from trees import tree_exceptions
from trees.binary_trees import binary_tree
@dataclass
class SingleThreadNode(binary_tree.Node):
"""Single Threaded Tree node definition."""
left: Optional["SingleThreadNode"] = None
right: Optional["SingleThreadNode"] = None
parent: Optional["SingleThreadNode"] = None
isThread: bool = False
@dataclass
class DoubleThreadNode(binary_tree.Node):
"""Double Threaded Tree node definition."""
left: Optional["DoubleThreadNode"] = None
right: Optional["DoubleThreadNode"] = None
parent: Optional["DoubleThreadNode"] = None
leftThread: bool = False
rightThread: bool = False
class RightThreadedBinaryTree(binary_tree.BinaryTree):
"""Right Threaded Binary Tree.
Attributes
----------
root: `Optional[SingleThreadNode]`
The root node of the right threaded binary search tree.
empty: `bool`
`True` if the tree is empty; `False` otherwise.
Methods
-------
search(key: `Any`)
Look for a node based on the given key.
insert(key: `Any`, data: `Any`)
Insert a (key, data) pair into the tree.
delete(key: `Any`)
Delete a node based on the given key from the tree.
inorder_traverse()
In-order traversal by using the right threads.
preorder_traverse()
Pre-order traversal by using the right threads.
get_leftmost(node: `SingleThreadNode`)
Return the node whose key is the smallest from the given subtree.
get_rightmost(node: `SingleThreadNode`)
Return the node whose key is the biggest from the given subtree.
get_successor(node: `SingleThreadNode`)
Return the successor node in the in-order order.
get_predecessor(node: `SingleThreadNode`)
Return the predecessor node in the in-order order.
get_height(node: `Optional[SingleThreadNode]`)
Return the height of the given node.
Examples
--------
>>> from trees.binary_trees import threaded_binary_tree
>>> tree = threaded_binary_tree.RightThreadedBinaryTree()
>>> tree.insert(key=23, data="23")
>>> tree.insert(key=4, data="4")
>>> tree.insert(key=30, data="30")
>>> tree.insert(key=11, data="11")
>>> tree.insert(key=7, data="7")
>>> tree.insert(key=34, data="34")
>>> tree.insert(key=20, data="20")
>>> tree.insert(key=24, data="24")
>>> tree.insert(key=22, data="22")
>>> tree.insert(key=15, data="15")
>>> tree.insert(key=1, data="1")
>>> [item for item in tree.inorder_traverse()]
[(1, '1'), (4, '4'), (7, '7'), (11, '11'), (15, '15'), (20, '20'),
(22, '22'), (23, '23'), (24, '24'), (30, '30'), (34, '34')]
>>> [item for item in tree.preorder_traverse()]
[(23, '23'), (4, '4'), (1, '1'), (11, '11'), (7, '7'), (20, '20'),
(15, '15'), (22, '22'), (30, '30'), (24, '24'), (34, '34')]
>>> tree.get_leftmost().key
1
>>> tree.get_leftmost().data
'1'
>>> tree.get_rightmost().key
34
>>> tree.get_rightmost().data
'34'
>>> tree.get_height(tree.root)
4
>>> tree.search(24).data
'24'
>>> tree.delete(15)
"""
def __init__(self):
binary_tree.BinaryTree.__init__(self)
# Override
def search(self, key: Any) -> SingleThreadNode:
"""Look for a node by a given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.search`.
"""
current = self.root
while current:
if key == current.key:
return current # type: ignore
elif key < current.key:
current = current.left
else: # key > current.key
if current.isThread is False:
current = current.right
else:
break
raise tree_exceptions.KeyNotFoundError(key=key)
# Override
def insert(self, key: Any, data: Any):
"""Insert a (key, data) pair into the right threaded binary tree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.insert`.
"""
node = SingleThreadNode(key=key, data=data)
if self.root is None:
self.root = node
else:
temp = self.root
while temp:
# Move to left subtree
if node.key < temp.key:
if temp.left:
temp = temp.left
continue
else:
temp.left = node
node.right = temp
node.isThread = True
node.parent = temp
break
# Move to right subtree
elif node.key > temp.key:
if temp.isThread is False and temp.right:
temp = temp.right
continue
else:
node.right = temp.right
temp.right = node
node.isThread = temp.isThread
temp.isThread = False
node.parent = temp
break
else:
raise tree_exceptions.DuplicateKeyError(key=key)
# Override
def delete(self, key: Any):
"""Delete the node by the given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.delete`.
"""
if self.root:
deleting_node = self.search(key=key)
# The deleting node has no child
if deleting_node.left is None and (
deleting_node.right is None or deleting_node.isThread
):
self._transplant(deleting_node=deleting_node, replacing_node=None)
# The deleting node has only one right child
elif deleting_node.left is None and deleting_node.isThread is False:
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.right
)
# The deleting node has only one left child,
elif deleting_node.left and deleting_node.isThread:
predecessor = self.get_predecessor(node=deleting_node)
if predecessor:
predecessor.right = deleting_node.right
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.left
)
# The deleting node has two children
elif (
deleting_node.left
and deleting_node.right
and deleting_node.isThread is False
):
predecessor = self.get_predecessor(node=deleting_node)
replacing_node: SingleThreadNode = self.get_leftmost(
node=deleting_node.right
)
# the minimum node is not a direct child of the deleting node
if replacing_node.parent != deleting_node:
if replacing_node.isThread:
self._transplant(
deleting_node=replacing_node, replacing_node=None
)
else:
self._transplant(
deleting_node=replacing_node,
replacing_node=replacing_node.right,
)
replacing_node.right = deleting_node.right
replacing_node.right.parent = replacing_node
replacing_node.isThread = False
self._transplant(
deleting_node=deleting_node, replacing_node=replacing_node
)
replacing_node.left = deleting_node.left
replacing_node.left.parent = replacing_node
if predecessor and predecessor.isThread:
predecessor.right = replacing_node
else:
raise RuntimeError("Invalid case. This should never happen.")
# Override
def get_leftmost(self, node: SingleThreadNode) -> SingleThreadNode:
"""Return the leftmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_leftmost`.
"""
current_node = node
while current_node.left:
current_node = current_node.left
return current_node
# Override
def get_rightmost(self, node: SingleThreadNode) -> SingleThreadNode:
"""Return the rightmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_rightmost`.
"""
current_node = node
while current_node.isThread is False and current_node.right:
current_node = current_node.right
return current_node
# Override
def get_successor(self, node: SingleThreadNode) -> Optional[SingleThreadNode]:
"""Return the successor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_successor`.
"""
if node.isThread:
return node.right
else:
if node.right:
return self.get_leftmost(node=node.right)
# if node.right is None, it means no successor of the given node.
return None
# Override
def get_predecessor(self, node: SingleThreadNode) -> Optional[SingleThreadNode]:
"""Return the predecessor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_predecessor`.
"""
if node.left:
return self.get_rightmost(node=node.left)
parent = node.parent
while parent and node == parent.left:
node = parent
parent = parent.parent
return parent
# Override
def get_height(self, node: Optional[SingleThreadNode]) -> int:
"""Return the height of the given node.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_height`.
"""
if node is None:
return 0
if node.left is None and (node.isThread or node.right is None):
# A leaf: any right pointer is a thread, not a real child.
return 0
if node.isThread:
# The right pointer is a thread; only the left subtree contributes height.
# (Recursing into the thread would loop back to an ancestor forever.)
return self.get_height(node.left) + 1
return max(self.get_height(node.left), self.get_height(node.right)) + 1
def inorder_traverse(self) -> binary_tree.Pairs:
"""Use the right threads to traverse the tree in in-order order.
Yields
------
`Pairs`
The next (key, data) pair in the tree in-order traversal.
"""
if self.root:
current: Optional[SingleThreadNode] = self.get_leftmost(node=self.root)
while current:
yield (current.key, current.data)
if current.isThread:
current = current.right
else:
if current.right is None:
break
current = self.get_leftmost(current.right)
def preorder_traverse(self) -> binary_tree.Pairs:
"""Use the right threads to traverse the tree in pre-order order.
Yields
------
`Pairs`
The next (key, data) pair in the tree pre-order traversal.
"""
current = self.root
while current:
yield (current.key, current.data)
if current.isThread:
current = current.right.right
else:
current = current.left
def _transplant(
self,
deleting_node: SingleThreadNode,
replacing_node: Optional[SingleThreadNode],
):
if deleting_node.parent is None:
self.root = replacing_node
if self.root:
self.root.isThread = False
elif deleting_node == deleting_node.parent.left:
deleting_node.parent.left = replacing_node
if replacing_node:
if deleting_node.isThread:
if replacing_node.isThread:
replacing_node.right = deleting_node.right
else: # deleting_node == deleting_node.parent.right
deleting_node.parent.right = replacing_node
if replacing_node:
if deleting_node.isThread:
if replacing_node.isThread:
replacing_node.right = deleting_node.right
else:
deleting_node.parent.right = deleting_node.right
deleting_node.parent.isThread = True
if replacing_node:
replacing_node.parent = deleting_node.parent
class LeftThreadedBinaryTree(binary_tree.BinaryTree):
"""Left Threaded Binary Tree.
Attributes
----------
root: `Optional[SingleThreadNode]`
The root node of the left threaded binary search tree.
empty: `bool`
`True` if the tree is empty; `False` otherwise.
Methods
-------
search(key: `Any`)
Look for a node based on the given key.
insert(key: `Any`, data: `Any`)
Insert a (key, data) pair into the tree.
delete(key: `Any`)
Delete a node based on the given key from the tree.
reverse_inorder_traverse()
Reversed In-order traversal by using the left threads.
get_leftmost(node: `SingleThreadNode`)
Return the node whose key is the smallest from the given subtree.
get_rightmost(node: `SingleThreadNode`)
Return the node whose key is the biggest from the given subtree.
get_successor(node: `SingleThreadNode`)
Return the successor node in the in-order order.
get_predecessor(node: `SingleThreadNode`)
Return the predecessor node in the in-order order.
get_height(node: `Optional[SingleThreadNode]`)
Return the height of the given node.
Examples
--------
>>> from trees.binary_trees import threaded_binary_tree
>>> tree = threaded_binary_tree.LeftThreadedBinaryTree()
>>> tree.insert(key=23, data="23")
>>> tree.insert(key=4, data="4")
>>> tree.insert(key=30, data="30")
>>> tree.insert(key=11, data="11")
>>> tree.insert(key=7, data="7")
>>> tree.insert(key=34, data="34")
>>> tree.insert(key=20, data="20")
>>> tree.insert(key=24, data="24")
>>> tree.insert(key=22, data="22")
>>> tree.insert(key=15, data="15")
>>> tree.insert(key=1, data="1")
>>> [item for item in tree.reverse_inorder_traverse()]
[(34, '34'), (30, '30'), (24, '24'), (23, '23'), (22, '22'),
(20, '20'), (15, '15'), (11, '11'), (7, '7'), (4, '4'), (1, '1')]
>>> tree.get_leftmost().key
1
>>> tree.get_leftmost().data
'1'
>>> tree.get_rightmost().key
34
>>> tree.get_rightmost().data
'34'
>>> tree.get_height(tree.root)
4
>>> tree.search(24).data
'24'
>>> tree.delete(15)
"""
def __init__(self):
binary_tree.BinaryTree.__init__(self)
# Override
def search(self, key: Any) -> SingleThreadNode:
"""Look for a node by a given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.search`.
"""
current = self.root
while current:
if key == current.key:
return current # type: ignore
elif key < current.key:
if current.isThread is False:
current = current.left
else:
break
else: # key > current.key:
current = current.right
raise tree_exceptions.KeyNotFoundError(key=key)
# Override
def insert(self, key: Any, data: Any):
"""Insert a (key, data) pair into the left threaded binary tree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.insert`.
"""
node = SingleThreadNode(key=key, data=data)
if self.root is None:
self.root = node
else:
temp = self.root
while temp:
# Move to right subtree
if node.key > temp.key:
if temp.right:
temp = temp.right
continue
else:
temp.right = node
node.left = temp
node.isThread = True
node.parent = temp
break
# Move to left subtree
elif node.key < temp.key:
if temp.isThread is False and temp.left:
temp = temp.left
continue
else:
node.left = temp.left
temp.left = node
node.isThread = temp.isThread
temp.isThread = False
node.parent = temp
break
else:
raise tree_exceptions.DuplicateKeyError(key=key)
# Override
def delete(self, key: Any):
"""Delete the node by the given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.delete`.
"""
if self.root:
deleting_node = self.search(key=key)
# The deleting node has no child
if deleting_node.right is None and (
deleting_node.left is None or deleting_node.isThread
):
self._transplant(deleting_node=deleting_node, replacing_node=None)
# The deleting node has only one right child,
elif deleting_node.right and deleting_node.isThread:
successor = self.get_successor(node=deleting_node)
if successor:
successor.left = deleting_node.left
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.right
)
# The deleting node has only one left child
elif (deleting_node.right is None) and (deleting_node.isThread is False):
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.left
)
# The deleting node has two children
elif deleting_node.right and deleting_node.left:
replacing_node: SingleThreadNode = self.get_leftmost(
node=deleting_node.right
)
successor = self.get_successor(node=replacing_node)
# the minimum node is not a direct child of the deleting node
if replacing_node.parent != deleting_node:
if replacing_node.isThread:
self._transplant(
deleting_node=replacing_node, replacing_node=None
)
else:
self._transplant(
deleting_node=replacing_node,
replacing_node=replacing_node.right,
)
replacing_node.right = deleting_node.right
replacing_node.right.parent = replacing_node
self._transplant(
deleting_node=deleting_node, replacing_node=replacing_node
)
replacing_node.left = deleting_node.left
replacing_node.left.parent = replacing_node
replacing_node.isThread = False
if successor and successor.isThread:
successor.left = replacing_node
else:
raise RuntimeError("Invalid case. This should never happen.")
# Override
def get_leftmost(self, node: SingleThreadNode) -> SingleThreadNode:
"""Return the leftmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_leftmost`.
"""
current_node = node
while current_node.left and current_node.isThread is False:
current_node = current_node.left
return current_node
# Override
def get_rightmost(self, node: SingleThreadNode) -> SingleThreadNode:
"""Return the rightmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_rightmost`.
"""
current_node = node
if current_node:
while current_node.right:
current_node = current_node.right
return current_node
# Override
def get_successor(self, node: SingleThreadNode) -> Optional[SingleThreadNode]:
"""Return the successor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_successor`.
"""
if node.right:
return self.get_leftmost(node=node.right)
parent = node.parent
while parent and node == parent.right:
node = parent
parent = parent.parent
return parent
# Override
def get_predecessor(self, node: SingleThreadNode) -> Optional[SingleThreadNode]:
"""Return the predecessor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_predecessor`.
"""
if node.isThread:
return node.left
else:
if node.left:
return self.get_rightmost(node=node.left)
# if node.left is None, it means no predecessor of the given node.
return None
# Override
def get_height(self, node: Optional[SingleThreadNode]) -> int:
"""Return the height of the given node.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_height`.
"""
if node is None:
return 0
if node.right is None and (node.isThread or node.left is None):
# A leaf: any left pointer is a thread, not a real child.
return 0
if node.isThread:
# The left pointer is a thread; only the right subtree contributes height.
# (Recursing into the thread would loop back to an ancestor forever.)
return self.get_height(node.right) + 1
return max(self.get_height(node.left), self.get_height(node.right)) + 1
def reverse_inorder_traverse(self) -> binary_tree.Pairs:
"""Use the left threads to traverse the tree in reversed in-order.
Yields
------
`Pairs`
The next (key, data) pair in the tree reversed in-order traversal.
"""
if self.root:
current: Optional[SingleThreadNode] = self.get_rightmost(node=self.root)
while current:
yield (current.key, current.data)
if current.isThread:
current = current.left
else:
if current.left is None:
break
current = self.get_rightmost(current.left)
def _transplant(
self,
deleting_node: SingleThreadNode,
replacing_node: Optional[SingleThreadNode],
):
if deleting_node.parent is None:
self.root = replacing_node
if self.root:
self.root.isThread = False
elif deleting_node == deleting_node.parent.left:
deleting_node.parent.left = replacing_node
if replacing_node:
if deleting_node.isThread:
if replacing_node.isThread:
replacing_node.left = deleting_node.left
else:
deleting_node.parent.left = deleting_node.left
deleting_node.parent.isThread = True
else: # deleting_node == deleting_node.parent.right
deleting_node.parent.right = replacing_node
if replacing_node:
if deleting_node.isThread:
if replacing_node.isThread:
replacing_node.left = deleting_node.left
if replacing_node:
replacing_node.parent = deleting_node.parent
class DoubleThreadedBinaryTree(binary_tree.BinaryTree):
"""Double Threaded Binary Tree.
Attributes
----------
root: `Optional[DoubleThreadNode]`
The root node of the left threaded binary search tree.
empty: `bool`
`True` if the tree is empty; `False` otherwise.
Methods
-------
search(key: `Any`)
Look for a node based on the given key.
insert(key: `Any`, data: `Any`)
Insert a (key, data) pair into the tree.
delete(key: `Any`)
Delete a node based on the given key from the tree.
inorder_traverse()
In-order traversal by using the right threads.
preorder_traverse()
Pre-order traversal by using the right threads.
reverse_inorder_traverse()
Reversed In-order traversal by using the left threads.
get_leftmost(node: `DoubleThreadNode`)
Return the node whose key is the smallest from the given subtree.
get_rightmost(node: `DoubleThreadNode`)
Return the node whose key is the biggest from the given subtree.
get_successor(node: `DoubleThreadNode`)
Return the successor node in the in-order order.
get_predecessor(node: `DoubleThreadNode`)
Return the predecessor node in the in-order order.
get_height(node: `Optional[DoubleThreadNode]`)
Return the height of the given node.
Examples
--------
>>> from trees.binary_trees import threaded_binary_tree
>>> tree = threaded_binary_tree.DoubleThreadedBinaryTree()
>>> tree.insert(key=23, data="23")
>>> tree.insert(key=4, data="4")
>>> tree.insert(key=30, data="30")
>>> tree.insert(key=11, data="11")
>>> tree.insert(key=7, data="7")
>>> tree.insert(key=34, data="34")
>>> tree.insert(key=20, data="20")
>>> tree.insert(key=24, data="24")
>>> tree.insert(key=22, data="22")
>>> tree.insert(key=15, data="15")
>>> tree.insert(key=1, data="1")
>>> [item for item in tree.inorder_traverse()]
[(1, '1'), (4, '4'), (7, '7'), (11, '11'), (15, '15'), (20, '20'),
(22, '22'), (23, '23'), (24, '24'), (30, '30'), (34, '34')]
>>> [item for item in tree.preorder_traverse()]
[(23, '23'), (4, '4'), (1, '1'), (11, '11'), (7, '7'), (20, '20'),
(15, '15'), (22, '22'), (30, '30'), (24, '24'), (34, '34')]
>>> [item for item in tree.reverse_inorder_traverse()]
[(34, '34'), (30, '30'), (24, '24'), (23, '23'), (22, '22'),
(20, '20'), (15, '15'), (11, '11'), (7, '7'), (4, '4'), (1, '1')]
>>> tree.get_leftmost().key
1
>>> tree.get_leftmost().data
'1'
>>> tree.get_rightmost().key
34
>>> tree.get_rightmost().data
'34'
>>> tree.get_height(tree.root)
4
>>> tree.search(24).data
'24'
>>> tree.delete(15)
"""
def __init__(self):
binary_tree.BinaryTree.__init__(self)
# Override
def search(self, key: Any) -> DoubleThreadNode:
"""Look for a node by a given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.search`.
"""
current = self.root
while current:
if key == current.key:
return current # type: ignore
elif key < current.key:
if current.leftThread is False:
current = current.left
else:
break
else: # key > current.key
if current.rightThread is False:
current = current.right
else:
break
raise tree_exceptions.KeyNotFoundError(key=key)
# Override
def insert(self, key: Any, data: Any):
"""Insert a (key, data) pair into the double threaded binary tree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.insert`.
"""
node = DoubleThreadNode(key=key, data=data)
if self.root is None:
self.root = node
else:
temp = self.root
while temp:
# Move to left subtree
if node.key < temp.key:
if temp.leftThread is False and temp.left:
temp = temp.left
continue
else:
node.left = temp.left
temp.left = node
node.right = temp
node.rightThread = True
node.parent = temp
temp.leftThread = False
if node.left:
node.leftThread = True
break
# Move to right subtree
elif node.key > temp.key:
if temp.rightThread is False and temp.right:
temp = temp.right
continue
else:
node.right = temp.right
temp.right = node
node.left = temp
node.leftThread = True
temp.rightThread = False
node.parent = temp
if node.right:
node.rightThread = True
break
else:
raise tree_exceptions.DuplicateKeyError(key=key)
# Override
def delete(self, key: Any):
"""Delete the node by the given key.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.delete`.
"""
if self.root:
deleting_node = self.search(key=key)
# The deleting node has no child
if (deleting_node.leftThread or deleting_node.left is None) and (
deleting_node.rightThread or deleting_node.right is None
):
self._transplant(deleting_node=deleting_node, replacing_node=None)
# The deleting node has only one right child
elif (
deleting_node.leftThread or deleting_node.left is None
) and deleting_node.rightThread is False:
successor = self.get_successor(node=deleting_node)
if successor:
successor.left = deleting_node.left
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.right
)
# The deleting node has only one left child,
elif (
deleting_node.rightThread or deleting_node.right is None
) and deleting_node.leftThread is False:
predecessor = self.get_predecessor(node=deleting_node)
if predecessor:
predecessor.right = deleting_node.right
self._transplant(
deleting_node=deleting_node, replacing_node=deleting_node.left
)
# The deleting node has two children
elif deleting_node.left and deleting_node.right:
predecessor = self.get_predecessor(node=deleting_node)
replacing_node: DoubleThreadNode = self.get_leftmost(
node=deleting_node.right
)
successor = self.get_successor(node=replacing_node)
# the minimum node is not a direct child of the deleting node
if replacing_node.parent != deleting_node:
if replacing_node.rightThread:
self._transplant(
deleting_node=replacing_node, replacing_node=None
)
else:
self._transplant(
deleting_node=replacing_node,
replacing_node=replacing_node.right,
)
replacing_node.right = deleting_node.right
replacing_node.right.parent = replacing_node
replacing_node.rightThread = False
self._transplant(
deleting_node=deleting_node, replacing_node=replacing_node
)
replacing_node.left = deleting_node.left
replacing_node.left.parent = replacing_node
replacing_node.leftThread = False
if predecessor and predecessor.rightThread:
predecessor.right = replacing_node
if successor and successor.leftThread:
successor.left = replacing_node
else:
raise RuntimeError("Invalid case. This should never happen.")
# Override
def get_leftmost(self, node: DoubleThreadNode) -> DoubleThreadNode:
"""Return the leftmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_leftmost`.
"""
current_node = node
while current_node.left and current_node.leftThread is False:
current_node = current_node.left
return current_node
# Override
def get_rightmost(self, node: DoubleThreadNode) -> DoubleThreadNode:
"""Return the rightmost node from a given subtree.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_rightmost`.
"""
current_node = node
if current_node:
while current_node.right and current_node.rightThread is False:
current_node = current_node.right
return current_node
# Override
def get_successor(self, node: DoubleThreadNode) -> Optional[DoubleThreadNode]:
"""Return the successor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_successor`.
"""
if node.rightThread:
return node.right
else:
if node.right:
return self.get_leftmost(node=node.right)
return None
# Override
def get_predecessor(self, node: DoubleThreadNode) -> Optional[DoubleThreadNode]:
"""Return the predecessor node in the in-order order.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_predecessor`.
"""
if node.leftThread:
return node.left
else:
if node.left:
return self.get_rightmost(node=node.left)
return None
# Override
def get_height(self, node: Optional[DoubleThreadNode]) -> int:
"""Return the height of the given node.
See Also
--------
:py:meth:`trees.binary_trees.binary_tree.BinaryTree.get_height`.
"""
if node is None:
return 0
# A thread is not a child: treat threaded pointers as absent when recursing,
# otherwise the recursion would follow a thread back up to an ancestor.
left = node.left if node.leftThread is False else None
right = node.right if node.rightThread is False else None
if left is None and right is None:
return 0
return max(self.get_height(left), self.get_height(right)) + 1
def preorder_traverse(self) -> binary_tree.Pairs:
"""Use the right threads to traverse the tree in pre-order order.
Yields
------
`Pairs`
The next (key, data) pair in the tree pre-order traversal.
"""
current = self.root
while current:
yield (current.key, current.data)
if current.rightThread:
current = current.right.right
elif current.leftThread is False:
current = current.left
else:
break
def inorder_traverse(self) -> binary_tree.Pairs:
"""Use the right threads to traverse the tree in in-order order.
Yields
------
`Pairs`
The next (key, data) pair in the tree in-order traversal.
"""
if self.root:
current: Optional[DoubleThreadNode] = self.get_leftmost(node=self.root)
while current:
yield (current.key, current.data)
if current.rightThread:
current = current.right
else:
if current.right is None:
break
current = self.get_leftmost(current.right)
def reverse_inorder_traverse(self) -> binary_tree.Pairs:
"""Use the left threads to traverse the tree in reversed in-order.
Yields
------
`Pairs`
The next (key, data) pair in the tree reversed in-order traversal.
"""
if self.root:
current: Optional[DoubleThreadNode] = self.get_rightmost(node=self.root)
while current:
yield (current.key, current.data)
if current.leftThread:
current = current.left
else:
if current.left is None:
break
current = self.get_rightmost(current.left)
def _transplant(
self,
deleting_node: DoubleThreadNode,
replacing_node: Optional[DoubleThreadNode],
):
if deleting_node.parent is None:
self.root = replacing_node
if self.root:
self.root.leftThread = False
self.root.rightThread = False
elif deleting_node == deleting_node.parent.left:
deleting_node.parent.left = replacing_node
if replacing_node:
if deleting_node.leftThread:
if replacing_node.leftThread:
replacing_node.left = deleting_node.left
if deleting_node.rightThread:
if replacing_node.rightThread:
replacing_node.right = deleting_node.right
else:
deleting_node.parent.left = deleting_node.left
deleting_node.parent.leftThread = True
else: # deleting_node == deleting_node.parent.right
deleting_node.parent.right = replacing_node
if replacing_node:
if deleting_node.leftThread:
if replacing_node.leftThread:
replacing_node.left = deleting_node.left
if deleting_node.rightThread:
if replacing_node.rightThread:
replacing_node.right = deleting_node.right
else:
deleting_node.parent.right = deleting_node.right
deleting_node.parent.rightThread = True
if replacing_node:
replacing_node.parent = deleting_node.parent
| 34.933095 | 85 | 0.550536 | 4,184 | 39,160 | 5.024857 | 0.038002 | 0.090183 | 0.035578 | 0.01484 | 0.92304 | 0.912148 | 0.901065 | 0.885607 | 0.87481 | 0.857449 | 0 | 0.015615 | 0.355669 | 39,160 | 1,120 | 86 | 34.964286 | 0.817573 | 0.309168 | 0 | 0.808511 | 0 | 0 | 0.008035 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.06383 | false | 0 | 0.007092 | 0 | 0.156028 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# notebooks/utils/plot_utils.py (repo: meren26/playground, license: Apache-2.0)
] | null | null | null | import matplotlib.pyplot as plt
import seaborn as sns
def countplots_with_multiple_categories(df, target: str, text: str):
"""
Plots multiple countplots with shared context.
Meaningful only when num of categorical variables <= 6 and >= 2
df: dataframe
hue: target
text: shared text among columns.
"""
cols = [col for col in df.columns if text in col]
if len(cols) <= 3:
figsize = (24, 7)
ncols = len(cols)
nrows = 1
elif len(cols) > 3:
figsize = (24, 24)
ncols = 2
nrows = int(len(cols) / 2) + (len(cols) % 2 > 0)
else:
print("Categorical variables are out of defined size")
return
fig, axes = plt.subplots(ncols=ncols, nrows=nrows, figsize=figsize, sharey=True)
axes = axes.flatten()
for ax, col in zip(axes, cols):
sns.countplot(x=col, hue=target, data=df, ax=ax)
plt.tight_layout()
plt.show()
def boxplots_with_multiple_categories(df, target: str, text: str):
"""
Plots multiple boxplots with shared context.
Meaningful only when num of categorical variables <= 6 and >= 2
df: dataframe
x: col
y: target
text: shared text among columns.
"""
cols = [col for col in df.columns if text in col]
if len(cols) <= 3:
figsize = (24, 7)
ncols = len(cols)
nrows = 1
elif len(cols) > 3:
figsize = (24, 24)
ncols = 2
nrows = int(len(cols) / 2) + (len(cols) % 2 > 0)
else:
print("Categorical variables are out of defined size")
return
fig, axes = plt.subplots(ncols=ncols, nrows=nrows, figsize=figsize, sharey=True)
axes = axes.flatten()
for ax, col in zip(axes, cols):
sns.boxplot(x=col, y=target, data=df, ax=ax)
plt.tight_layout()
plt.show()
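Both helpers above share the same column-selection and grid-layout logic; a minimal sketch of just that part, with made-up column names (no plotting, so no DataFrame is needed):

```python
# Stand-in for df.columns; the real helpers receive a pandas DataFrame.
columns = ["cat_color", "cat_size", "cat_shape", "cat_weight", "price"]
text = "cat_"

# Select every column whose name contains the shared text.
cols = [col for col in columns if text in col]

# Grid layout mirrored from the helpers: one row for up to three plots,
# otherwise a two-column grid with enough rows to hold them all.
if len(cols) <= 3:
    ncols, nrows = len(cols), 1
else:
    ncols, nrows = 2, len(cols) // 2 + (len(cols) % 2 > 0)

print(cols)            # ['cat_color', 'cat_size', 'cat_shape', 'cat_weight']
print((ncols, nrows))  # (2, 2)
```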
# server/apps/project/tests/__init__.py (repo: iotile/iotile_cloud, license: MIT)
from .test_project import *
from .test_project_api import *
from .test_project_clone import *
# zeeguu/core/definition_of_learned/__init__.py (repo: mircealungu/Zeeguu-API-2, license: MIT)
from .is_learned import is_learned_based_on_exercise_outcomes
from .is_learned import CORRECTS_IN_DISTINCT_DAYS_FOR_LEARNED
# app.py (repo: gjaiswal108/Automatic-Notification-Sender, license: Apache-2.0)
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
if(True):
print("Program started")
mydb=mysql.connector.connect(host="localhost",user="root",password="",database="user")
r=requests.get("http://mmmut.ac.in/")
r.raise_for_status()
mySoup=bs4.BeautifulSoup(r.text,features="html.parser")
elems=mySoup.select('#ctl00_DataList1 span')
elemText=[]
for x in elems:
if(x.span):
if(x.a):
elemText.append([(x.span.string),(x.a.get('href'))])
else:
elemText.append([(x.span.string),''])
mycursor=mydb.cursor()
mycursor.execute("select * from data")
res=mycursor.fetchall()
res_set=set()
for x in res:
res_set.add(x[0])
for elem in elemText:
if(elem[0] not in res_set):
s = smtplib.SMTP('smtp.gmail.com', 587)
# start TLS for security
s.starttls()
# Authentication
s.login("sender_gmail_id", "password")
text="New Notice-> "+elem[0]
html="<h3>New Notice-> </h3><h4>"+elem[0]+"</h4>"
if(elem[1]!=''):
if('http' in elem[1]):
html+='<br/><h4><a href="'+elem[1]+'">Click here</a> to download.</h4>'
else:
html+='<br/><h4><a href="'+'http://mmmut.ac.in/'+elem[1]+'">Click here</a> to download.</h4>'
mycursor.execute("select * from emailList");
emaillist=mycursor.fetchall()
for i in emaillist:
message = MIMEMultipart("alternative")
message["Subject"] = "New Notice on mmmut.ac.in"
message["From"] = "sender_gmail_id"
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
message["To"] =i[0]
message.attach(part1)
message.attach(part2)
s.sendmail("sender_gmail_id",i[0], message.as_string())
print(elem[0])
s.quit()
if(len(elemText)):
mycursor.execute("truncate table data")
mycursor.execute("commit")
for elem in elemText:
sqlString='insert into data values("'+elem[0]+'")'
mycursor.execute(sqlString)
mycursor.execute("commit")
elems=mySoup.select('#ctl00_DataList2 span')
elemText=[]
for x in elems:
if(x.span):
if(x.a):
elemText.append([(x.span.string),(x.a.get('href'))])
else:
elemText.append([(x.span.string),''])
mycursor=mydb.cursor()
mycursor.execute("select * from notice")
res=mycursor.fetchall()
res_set=set()
for x in res:
res_set.add(x[0])
for elem in elemText:
if(elem[0] not in res_set):
s = smtplib.SMTP('smtp.gmail.com', 587)
# start TLS for security
s.starttls()
# Authentication
s.login("sender_gmail_id", "password")
text="New Notice-> "+elem[0]
html="<h3>New Notice-> </h3><h4>"+elem[0]+"</h4>"
if(elem[1]!=''):
if('http' in elem[1]):
html+='<br/><h4><a href="'+elem[1]+'">Click here</a> to download.</h4>'
else:
html+='<br/><h4><a href="'+'http://mmmut.ac.in/'+elem[1]+'">Click here</a> to download.</h4>'
mycursor.execute("select * from emailList");
emaillist=mycursor.fetchall()
for i in emaillist:
message = MIMEMultipart("alternative")
message["Subject"] = "New Notice on mmmut.ac.in"
message["From"] = "sender_gmail_id"
part1 = MIMEText(text, "plain")
part2 = MIMEText(html, "html")
message["To"] =i[0]
message.attach(part1)
message.attach(part2)
s.sendmail("sender_gmail_id",i[0], message.as_string())
print(elem[0])
s.quit()
if(len(elemText)):
mycursor.execute("truncate table notice")
mycursor.execute("commit")
for elem in elemText:
sqlString='insert into notice values("'+elem[0]+'")'
mycursor.execute(sqlString)
mycursor.execute("commit")
print("Ended")
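# A hardening sketch (the `SMTP_USER`/`SMTP_PASS` variable names below are
# illustrative, not part of the original script): the Gmail credentials are
# better read from the environment than hard-coded next to the code, e.g.
#
#     import os
#     SENDER = os.environ["SMTP_USER"]
#     PASSWORD = os.environ["SMTP_PASS"]
#
# so the script can be committed to version control without leaking secrets.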
| 35.382114 | 113 | 0.527114 | 511 | 4,352 | 4.44227 | 0.213307 | 0.079295 | 0.034361 | 0.03348 | 0.829075 | 0.829075 | 0.829075 | 0.829075 | 0.829075 | 0.779736 | 0 | 0.020625 | 0.309283 | 4,352 | 122 | 114 | 35.672131 | 0.734531 | 0.017923 | 0 | 0.826923 | 0 | 0 | 0.213968 | 0 | 0.019231 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.028846 | 0.028846 | 0 | 0.028846 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
08ceb9ca5926e45b522436b8b77ab57a173f531f | 239 | py | Python | Modulo_1/semana3/Ciclos/while/while-else.py | rubens233/cocid_python | 492ebdf21817e693e5eb330ee006397272f2e0cc | [
"MIT"
] | null | null | null | Modulo_1/semana3/Ciclos/while/while-else.py | rubens233/cocid_python | 492ebdf21817e693e5eb330ee006397272f2e0cc | [
"MIT"
] | null | null | null | Modulo_1/semana3/Ciclos/while/while-else.py | rubens233/cocid_python | 492ebdf21817e693e5eb330ee006397272f2e0cc | [
"MIT"
] | 1 | 2022-03-04T00:57:18.000Z | 2022-03-04T00:57:18.000Z |
i = 0
while i < 4:
    i += 1
    print(i)
else:  # Executed, because the while loop finished without a break
    print("No Break\n")
i = 0
while i < 4:
    i += 1
    print(i)
    break
else:  # Not executed, because the while loop was broken out of
    print("No Break") | 15.933333 | 48 | 0.552301 | 43 | 239 | 3.069767 | 0.348837 | 0.030303 | 0.106061 | 0.121212 | 0.878788 | 0.878788 | 0.878788 | 0.878788 | 0.878788 | 0.621212 | 0 | 0.037736 | 0.334728 | 239 | 15 | 49 | 15.933333 | 0.792453 | 0.317992 | 0 | 0.769231 | 0 | 0 | 0.1125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.307692 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3ea5c317db09a998320b2353f5dedde4b7deb55e | 76,224 | py | Python | kuryr_libnetwork/tests/unit/test_kuryr_ipam.py | bbc/kuryr-libnetwork | 8f7866809240acd52667f9e47a5df4a700d9290b | [
"Apache-2.0"
] | 26 | 2016-05-23T01:18:10.000Z | 2020-04-20T14:01:07.000Z | kuryr_libnetwork/tests/unit/test_kuryr_ipam.py | bbc/kuryr-libnetwork | 8f7866809240acd52667f9e47a5df4a700d9290b | [
"Apache-2.0"
] | 1 | 2019-11-01T13:03:25.000Z | 2019-11-01T13:03:26.000Z | kuryr_libnetwork/tests/unit/test_kuryr_ipam.py | bbc/kuryr-libnetwork | 8f7866809240acd52667f9e47a5df4a700d9290b | [
"Apache-2.0"
] | 16 | 2016-07-02T23:46:51.000Z | 2021-05-21T09:55:02.000Z | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import ipaddress
from unittest import mock
import ddt
from oslo_serialization import jsonutils
from oslo_utils import uuidutils
from werkzeug import exceptions as w_exceptions
from kuryr.lib import constants as lib_const
from kuryr.lib import utils as lib_utils
from kuryr_libnetwork import config
from kuryr_libnetwork import constants as const
from kuryr_libnetwork.tests.unit import base
from kuryr_libnetwork import utils
FAKE_IP4_CIDR = '10.0.0.0/16'
FAKE_IP6_CIDR = 'fe80::/64'
@ddt.ddt
class TestKuryrIpam(base.TestKuryrBase):
"""Basic unit tests for libnetwork remote IPAM driver URI endpoints.
This test class covers the following HTTP methods and URIs as described in
the remote IPAM driver specification as below:
https://github.com/docker/libnetwork/blob/9bf339f27e9f5c7c922036706c9bcc410899f249/docs/ipam.md # noqa
- POST /IpamDriver.GetDefaultAddressSpaces
- POST /IpamDriver.RequestPool
- POST /IpamDriver.ReleasePool
- POST /IpamDriver.RequestAddress
- POST /IpamDriver.ReleaseAddress
"""
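    # For orientation, a RequestPool call in the tests below posts JSON of
    # this shape (values illustrative, taken from the fake_request fixtures):
    #
    #     {"AddressSpace": "", "Pool": "10.0.0.0/16", "SubPool": "",
    #      "Options": {}, "V6": false}
    #
    # and a successful response carries the Neutron subnetpool UUID back as
    # {"PoolID": "<uuid>"}.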
@ddt.data(
('/IpamDriver.GetDefaultAddressSpaces',
{"LocalDefaultAddressSpace":
config.CONF.local_default_address_space,
"GlobalDefaultAddressSpace":
config.CONF.global_default_address_space}),
('/IpamDriver.GetCapabilities',
{"RequiresMACAddress": True}))
@ddt.unpack
def test_remote_ipam_driver_endpoint(self, endpoint, expected):
response = self.app.post(endpoint)
self.assertEqual(200, response.status_code)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(expected, decoded_json)
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@ddt.data((FAKE_IP4_CIDR), (FAKE_IP6_CIDR))
def test_ipam_driver_request_pool_with_existing_subnet_id(self,
pool_cidr, mock_list_subnetpools,
mock_create_subnetpool, mock_add_tag):
neutron_subnet_v4_id = uuidutils.generate_uuid()
pool_name = lib_utils.get_neutron_subnetpool_name(pool_cidr)
prefixlen = ipaddress.ip_network(str(pool_cidr)).prefixlen
new_subnetpool = {
'name': pool_name,
'default_prefixlen': prefixlen,
'prefixes': [pool_cidr],
'shared': False}
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = pool_name
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
mock_list_subnetpools.return_value = {'subnetpools': []}
fake_subnetpool_response = {
'subnetpool': kuryr_subnetpools['subnetpools'][0]
}
mock_create_subnetpool.return_value = fake_subnetpool_response
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': '', # In the case --ip-range is not given
'Options': {
'neutron.subnet.uuid': neutron_subnet_v4_id
},
'V6': False
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(
name=pool_name, tags=[str(neutron_subnet_v4_id)])
mock_create_subnetpool.assert_called_with(
{'subnetpool': new_subnetpool})
mock_add_tag.assert_called_once_with(
'subnetpools', fake_kuryr_subnetpool_id, neutron_subnet_v4_id)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@ddt.data((FAKE_IP4_CIDR), (FAKE_IP6_CIDR))
def test_ipam_driver_request_pool_with_existing_subnet_id_and_shared(self,
pool_cidr, mock_list_subnetpools,
mock_create_subnetpool, mock_add_tag):
neutron_subnet_v4_id = uuidutils.generate_uuid()
pool_name = lib_utils.get_neutron_subnetpool_name(pool_cidr)
prefixlen = ipaddress.ip_network(str(pool_cidr)).prefixlen
new_subnetpool = {
'name': pool_name,
'default_prefixlen': prefixlen,
'prefixes': [pool_cidr],
'shared': True}
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = pool_name
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
mock_list_subnetpools.return_value = {'subnetpools': []}
fake_subnetpool_response = {
'subnetpool': kuryr_subnetpools['subnetpools'][0]
}
mock_create_subnetpool.return_value = fake_subnetpool_response
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': '', # In the case --ip-range is not given
'Options': {
'neutron.subnet.uuid': neutron_subnet_v4_id,
'neutron.net.shared': True
},
'V6': False
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(
name=pool_name, tags=[str(neutron_subnet_v4_id)])
mock_create_subnetpool.assert_called_with(
{'subnetpool': new_subnetpool})
mock_add_tag.assert_called_once_with(
'subnetpools', fake_kuryr_subnetpool_id, neutron_subnet_v4_id)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@ddt.data((FAKE_IP4_CIDR), (FAKE_IP6_CIDR))
def test_ipam_driver_request_pool_with_existing_subnet_name(self,
pool_cidr, mock_list_subnets, mock_list_subnetpools,
mock_create_subnetpool, mock_add_tag):
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
neutron_subnet_v4_id = uuidutils.generate_uuid()
neutron_subnet_v4_name = utils.make_subnet_name(FAKE_IP4_CIDR)
# Faking existing Neutron subnets
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id,
subnet_v4_id=neutron_subnet_v4_id,
cidr=FAKE_IP4_CIDR, name=neutron_subnet_v4_name)
fake_subnets = {
'subnets': [
fake_v4_subnet['subnet'],
]
}
mock_list_subnets.return_value = fake_subnets
pool_name = lib_utils.get_neutron_subnetpool_name(pool_cidr)
prefixlen = ipaddress.ip_network(str(pool_cidr)).prefixlen
new_subnetpool = {
'name': pool_name,
'default_prefixlen': prefixlen,
'prefixes': [pool_cidr],
'shared': False}
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = pool_name
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
mock_list_subnetpools.return_value = {'subnetpools': []}
fake_subnetpool_response = {
'subnetpool': kuryr_subnetpools['subnetpools'][0]
}
mock_create_subnetpool.return_value = fake_subnetpool_response
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': '', # In the case --ip-range is not given
'Options': {
'neutron.subnet.name': neutron_subnet_v4_name
},
'V6': False
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(name=neutron_subnet_v4_name)
mock_list_subnetpools.assert_called_with(
name=pool_name, tags=[str(neutron_subnet_v4_id)])
mock_create_subnetpool.assert_called_with(
{'subnetpool': new_subnetpool})
mock_add_tag.assert_called_once_with(
'subnetpools', fake_kuryr_subnetpool_id, neutron_subnet_v4_id)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@ddt.data((FAKE_IP4_CIDR), (FAKE_IP6_CIDR))
def test_ipam_driver_request_pool_with_user_pool(self, pool_cidr,
mock_list_subnets, mock_list_subnetpools, mock_create_subnetpool):
fake_subnet = {"subnets": []}
mock_list_subnets.return_value = fake_subnet
pool_name = lib_utils.get_neutron_subnetpool_name(pool_cidr)
prefixlen = ipaddress.ip_network(str(pool_cidr)).prefixlen
new_subnetpool = {
'name': pool_name,
'default_prefixlen': prefixlen,
'prefixes': [pool_cidr],
'shared': False}
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = pool_name
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
mock_list_subnetpools.return_value = {'subnetpools': []}
fake_subnetpool_response = {
'subnetpool': kuryr_subnetpools['subnetpools'][0]
}
mock_create_subnetpool.return_value = fake_subnetpool_response
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': '', # In the case --ip-range is not given
'Options': {},
'V6': False
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(cidr=pool_cidr)
mock_list_subnetpools.assert_called_with(name=fake_name)
mock_create_subnetpool.assert_called_with(
{'subnetpool': new_subnetpool})
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((True, FAKE_IP4_CIDR), (True, FAKE_IP6_CIDR),
(False, FAKE_IP4_CIDR), (False, FAKE_IP6_CIDR))
@ddt.unpack
def test_ipam_driver_request_pool_with_pool_name_option(self,
use_tag_ext, pool_cidr, mock_app, mock_list_subnets,
mock_list_subnetpools, mock_add_tag):
mock_app.tag_ext = use_tag_ext
fake_subnet = {"subnets": []}
mock_list_subnets.return_value = fake_subnet
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = 'fake_pool_name'
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
options = {
const.NEUTRON_POOL_NAME_OPTION: fake_name}
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr],
name=fake_name)
options = {
const.NEUTRON_V6_POOL_NAME_OPTION: fake_name}
mock_list_subnetpools.return_value = kuryr_subnetpools
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': pool_cidr,
'Options': options,
'V6': False if pool_cidr == FAKE_IP4_CIDR else True
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(cidr=pool_cidr)
if mock_app.tag_ext:
mock_add_tag.assert_called_once_with(
'subnetpools', fake_kuryr_subnetpool_id,
const.KURYR_EXISTING_NEUTRON_SUBNETPOOL)
else:
mock_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((True, FAKE_IP4_CIDR), (True, FAKE_IP6_CIDR),
(False, FAKE_IP4_CIDR), (False, FAKE_IP6_CIDR))
@ddt.unpack
def test_ipam_driver_request_pool_with_pool_id_option(self,
use_tag_ext, pool_cidr, mock_app, mock_list_subnets,
mock_list_subnetpools, mock_add_tag):
mock_app.tag_ext = use_tag_ext
fake_subnet = {"subnets": []}
mock_list_subnets.return_value = fake_subnet
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr])
options = {
const.NEUTRON_POOL_UUID_OPTION: fake_kuryr_subnetpool_id}
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[pool_cidr])
options = {
const.NEUTRON_V6_POOL_UUID_OPTION: fake_kuryr_subnetpool_id}
mock_list_subnetpools.return_value = kuryr_subnetpools
fake_request = {
'AddressSpace': '',
'Pool': pool_cidr,
'SubPool': pool_cidr,
'Options': options,
'V6': False if pool_cidr == FAKE_IP4_CIDR else True
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(cidr=pool_cidr)
if mock_app.tag_ext:
mock_add_tag.assert_called_once_with(
'subnetpools', fake_kuryr_subnetpool_id,
const.KURYR_EXISTING_NEUTRON_SUBNETPOOL)
else:
mock_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((True, FAKE_IP4_CIDR), (True, FAKE_IP6_CIDR),
(False, FAKE_IP4_CIDR), (False, FAKE_IP6_CIDR))
@ddt.unpack
def test_ipam_driver_request_pool_with_unmatched_cidr(self,
use_tag_ext, pool_cidr, mock_app, mock_list_subnets,
mock_list_subnetpools, mock_create_subnetpool, mock_add_tag):
mock_app.tag_ext = use_tag_ext
fake_subnet = {"subnets": []}
mock_list_subnets.return_value = fake_subnet
subnet_ip4_cidr = '10.0.0.0/24'
subnet_ip6_cidr = 'fe80::/68'
if pool_cidr == FAKE_IP4_CIDR:
subnet_cidr = subnet_ip4_cidr
else:
subnet_cidr = subnet_ip6_cidr
pool_name = lib_utils.get_neutron_subnetpool_name(subnet_cidr)
prefixlen = ipaddress.ip_network(str(subnet_cidr)).prefixlen
new_subnetpool = {
'name': pool_name,
'default_prefixlen': prefixlen,
'prefixes': [subnet_cidr],
'shared': False}
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_existing_subnetpool_id = uuidutils.generate_uuid()
if pool_cidr == FAKE_IP4_CIDR:
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[subnet_ip4_cidr])
existing_subnetpools = self._get_fake_v4_subnetpools(
fake_existing_subnetpool_id, prefixes=[pool_cidr])
options = {
const.NEUTRON_POOL_UUID_OPTION: fake_existing_subnetpool_id}
else:
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[subnet_ip6_cidr])
existing_subnetpools = self._get_fake_v6_subnetpools(
fake_existing_subnetpool_id, prefixes=[pool_cidr])
options = {
const.NEUTRON_V6_POOL_UUID_OPTION: fake_existing_subnetpool_id}
mock_list_subnetpools.side_effect = [
existing_subnetpools,
{'subnetpools': []}
]
fake_subnetpool_response = {
'subnetpool': kuryr_subnetpools['subnetpools'][0]
}
mock_create_subnetpool.return_value = fake_subnetpool_response
if pool_cidr == FAKE_IP4_CIDR:
subnet_cidr = subnet_ip4_cidr
fake_request = {
'AddressSpace': '',
'Pool': subnet_cidr,
'SubPool': subnet_cidr,
'Options': options,
'V6': False,
}
else:
subnet_cidr = subnet_ip6_cidr
fake_request = {
'AddressSpace': '',
'Pool': subnet_cidr,
'SubPool': subnet_cidr,
'Options': options,
'V6': True,
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(cidr=subnet_cidr)
mock_create_subnetpool.assert_called_with(
{'subnetpool': new_subnetpool})
if mock_app.tag_ext:
mock_add_tag.assert_called_once_with(
'subnetpools', fake_existing_subnetpool_id,
const.KURYR_EXISTING_NEUTRON_SUBNETPOOL)
else:
mock_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
def test_ipam_driver_request_pool_with_default_v6pool(self,
mock_list_subnetpools):
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = 'kuryr6'
kuryr_subnetpools = self._get_fake_v6_subnetpools(
fake_kuryr_subnetpool_id, prefixes=['fe80::/64'])
mock_list_subnetpools.return_value = {
'subnetpools': kuryr_subnetpools['subnetpools']}
fake_request = {
'AddressSpace': '',
'Pool': '',
'SubPool': '', # In the case --ip-range is not given
'Options': {},
'V6': True
}
response = self.app.post('/IpamDriver.RequestPool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(name=fake_name)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_kuryr_subnetpool_id, decoded_json['PoolID'])
@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_subnetpool')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.remove_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((True), (False))
def test_ipam_driver_release_pool(self,
use_tag_ext,
mock_app,
mock_list_subnetpools,
mock_list_subnets,
mock_remove_tag,
mock_delete_subnetpool):
mock_app.tag_ext = use_tag_ext
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_subnetpool_name = lib_utils.get_neutron_subnetpool_name(
FAKE_IP4_CIDR)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_subnetpool_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
mock_delete_subnetpool.return_value = {}
fake_request = {
'PoolID': fake_kuryr_subnetpool_id
}
response = self.app.post('/IpamDriver.ReleasePool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
if mock_app.tag_ext:
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
cidr=FAKE_IP4_CIDR)
mock_remove_tag.assert_called_with('subnets',
subnet_v4_id,
fake_kuryr_subnetpool_id)
mock_delete_subnetpool.assert_called_with(fake_kuryr_subnetpool_id)
@mock.patch('kuryr_libnetwork.controllers.app.neutron.remove_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((True), (False))
def test_ipam_driver_release_pool_with_pool_name_option(
self, use_tag_ext, mock_app, mock_list_subnetpools,
mock_list_subnets, mock_remove_tag):
mock_app.tag_ext = use_tag_ext
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_subnetpool_name = 'fake_pool_name'
fake_tags = []
if mock_app.tag_ext:
fake_tags.append(const.KURYR_EXISTING_NEUTRON_SUBNETPOOL)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_subnetpool_name, tags=fake_tags)
mock_list_subnetpools.return_value = kuryr_subnetpools
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
fake_request = {
'PoolID': fake_kuryr_subnetpool_id
}
response = self.app.post('/IpamDriver.ReleasePool',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
if mock_app.tag_ext:
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
cidr=FAKE_IP4_CIDR)
mock_remove_tag.assert_any_call('subnets',
subnet_v4_id,
fake_kuryr_subnetpool_id)
mock_remove_tag.assert_any_call(
'subnetpools', fake_kuryr_subnetpool_id,
const.KURYR_EXISTING_NEUTRON_SUBNETPOOL)
@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((False), (True))
def test_ipam_driver_request_address(self, use_tag_ext, mock_app,
mock_list_subnets, mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
# faking create_port
fake_neutron_port_id = uuidutils.generate_uuid()
fake_mac_address = 'fa:16:3e:ca:59:88'
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address="10.0.0.5",
neutron_mac_address=fake_mac_address)
port_request = {
'name': const.KURYR_UNBOUND_PORT,
'admin_state_up': True,
'network_id': neutron_network_id,
}
fixed_ips = port_request['fixed_ips'] = []
fixed_ip = {'subnet_id': subnet_v4_id}
fixed_ips.append(fixed_ip)
mock_create_port.return_value = fake_port
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '', # Querying for container address
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_create_port.assert_called_with({'port': port_request})
decoded_json = jsonutils.loads(response.data)
self.assertEqual('10.0.0.5/16', decoded_json['Address'])
if mock_app.tag_ext:
mock_port_add_tag.assert_called()
else:
mock_port_add_tag.assert_not_called()
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
def test_ipam_driver_request_address_when_subnet_not_exist(self,
mock_list_subnetpools, mock_list_subnets):
requested_address = '10.0.0.5'
fake_mac_address = 'fa:16:3e:ca:59:88'
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = lib_utils.get_neutron_subnetpool_name(FAKE_IP4_CIDR)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
# faking list_subnets
fake_subnet_response = {'subnets': []}
mock_list_subnets.return_value = fake_subnet_response
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': requested_address,
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(500, response.status_code)
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(cidr=FAKE_IP4_CIDR)
@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.update_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data((False), (True))
def test_ipam_driver_request_specific_address(self,
use_tag_ext, mock_app, mock_list_subnets, mock_list_ports,
mock_update_port, mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
# faking list_subnets
neutron_network_id = uuidutils.generate_uuid()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
# faking update_port or create_port
requested_address = '10.0.0.5'
fake_mac_address = 'fa:16:3e:ca:59:88'
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=requested_address,
neutron_mac_address=fake_mac_address)
fixed_ip_existing = [('subnet_id=%s' % subnet_v4_id)]
fixed_ip_existing.append('ip_address=%s' % requested_address)
fake_ports_response = {'ports': []}
mock_list_ports.return_value = fake_ports_response
port_request = {
'name': const.KURYR_UNBOUND_PORT,
'admin_state_up': True,
'network_id': neutron_network_id,
}
fixed_ips = port_request['fixed_ips'] = []
fixed_ip = {'subnet_id': subnet_v4_id,
'ip_address': requested_address}
fixed_ips.append(fixed_ip)
mock_create_port.return_value = fake_port
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': requested_address,
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_list_ports.assert_has_calls([
mock.call(fixed_ips=fixed_ip_existing),
mock.call(
mac_address=fake_mac_address,
fixed_ips='subnet_id=%s' % fake_v4_subnet['subnet']['id'])])
mock_create_port.assert_called_with({'port': port_request})
if mock_app.tag_ext:
mock_port_add_tag.assert_called()
else:
mock_port_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual(requested_address + '/16', decoded_json['Address'])

@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data(False, True)
def test_ipam_driver_request_specific_address_existing_port(self,
use_tag_ext, mock_app, mock_list_subnets, mock_list_ports,
mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
# faking list_subnets
neutron_network_id = uuidutils.generate_uuid()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
subnet_v6_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_v6_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_v6_subnet = self._get_fake_v6_subnet(
neutron_network_id, docker_endpoint_id, subnet_v6_id,
subnetpool_id=fake_kuryr_subnetpool_v6_id,
cidr=FAKE_IP6_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
fake_subnet_response_v6 = {
'subnets': [
fake_v6_subnet['subnet']
]
}
mock_list_subnets.side_effect = [
fake_subnet_response, fake_subnet_response_v6]
# faking update_port or create_port
requested_address = '10.0.0.5'
requested_address_v6 = 'fe80::6'
requested_mac_address = 'fa:16:3e:86:a0:fe'
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id, subnet_v6_id,
neutron_subnet_v4_address=requested_address,
neutron_subnet_v6_address=requested_address_v6,
neutron_mac_address=requested_mac_address)
fixed_ip_existing = [('subnet_id=%s' % subnet_v4_id)]
fixed_ipv6_existing = [('subnet_id=%s' % subnet_v6_id)]
fixed_ip_existing.append('ip_address=%s' % requested_address)
fixed_ipv6_existing.append('ip_address=%s' % requested_address_v6)
fake_existing_port = dict(fake_port['port'])
fake_existing_port['binding:host_id'] = ''
fake_existing_port['binding:vif_type'] = 'unbound'
fake_ports_response = {'ports': [fake_existing_port]}
fake_existing_port_2 = dict(fake_port['port'])
fake_existing_port_2['name'] = const.NEUTRON_UNBOUND_PORT
fake_existing_port_2['binding:host_id'] = lib_utils.get_hostname()
fake_ports_response_2 = {'ports': [fake_existing_port_2]}
mock_list_ports.side_effect = [
fake_ports_response, fake_ports_response_2]
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': requested_address,
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: requested_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(requested_address + '/16', decoded_json['Address'])
fake_request_2 = {
'PoolID': fake_kuryr_subnetpool_v6_id,
'Address': requested_address_v6,
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: requested_mac_address}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request_2))
self.assertEqual(200, response.status_code)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(requested_address_v6 + '/64', decoded_json['Address'])
mock_list_subnets.assert_has_calls([
mock.call(subnetpool_id=fake_kuryr_subnetpool_id),
mock.call(subnetpool_id=fake_kuryr_subnetpool_v6_id)])
mock_list_ports.assert_has_calls([
mock.call(fixed_ips=fixed_ip_existing),
mock.call(fixed_ips=fixed_ipv6_existing)])
if mock_app.tag_ext:
self.assertEqual(2, mock_port_add_tag.call_count)
else:
self.assertEqual(0, mock_port_add_tag.call_count)

@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data(False, True)
def test_ipam_driver_request_specific_mac_address_existing_port(self,
use_tag_ext, mock_app, mock_list_subnets, mock_list_ports,
mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
# faking list_subnets
neutron_network_id = uuidutils.generate_uuid()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
subnet_v6_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_v6_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_v6_subnet = self._get_fake_v6_subnet(
neutron_network_id, docker_endpoint_id, subnet_v6_id,
subnetpool_id=fake_kuryr_subnetpool_v6_id,
cidr=FAKE_IP6_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
fake_subnet_response_v6 = {
'subnets': [
fake_v6_subnet['subnet']
]
}
mock_list_subnets.side_effect = [
fake_subnet_response, fake_subnet_response_v6]
# faking update_port or create_port
fake_address = '10.0.0.5'
fake_address_v6 = 'fe80::6'
requested_mac_address = 'fa:16:3e:86:a0:fe'
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id, subnet_v6_id,
neutron_subnet_v4_address=fake_address,
neutron_subnet_v6_address=fake_address_v6,
neutron_mac_address=requested_mac_address)
fixed_ip_existing = [('subnet_id=%s' % subnet_v4_id)]
fixed_ipv6_existing = [('subnet_id=%s' % subnet_v6_id)]
fixed_ip_existing.append('ip_address=%s' % fake_address)
fixed_ipv6_existing.append('ip_address=%s' % fake_address_v6)
fake_existing_port = dict(fake_port['port'])
fake_existing_port['binding:host_id'] = ''
fake_existing_port['binding:vif_type'] = 'unbound'
fake_ports_response = {'ports': [fake_existing_port]}
fake_existing_port_2 = dict(fake_port['port'])
fake_existing_port_2['name'] = const.NEUTRON_UNBOUND_PORT
fake_existing_port_2['binding:host_id'] = lib_utils.get_hostname()
fake_ports_response_2 = {'ports': [fake_existing_port_2]}
mock_list_ports.side_effect = [
fake_ports_response, fake_ports_response_2]
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '',
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: requested_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_address + '/16', decoded_json['Address'])
fake_request_2 = {
'PoolID': fake_kuryr_subnetpool_v6_id,
'Address': '',
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: requested_mac_address}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request_2))
self.assertEqual(200, response.status_code)
decoded_json = jsonutils.loads(response.data)
self.assertEqual(fake_address_v6 + '/64', decoded_json['Address'])
mock_list_subnets.assert_has_calls([
mock.call(subnetpool_id=fake_kuryr_subnetpool_id),
mock.call(subnetpool_id=fake_kuryr_subnetpool_v6_id)])
mock_list_ports.assert_has_calls([
mock.call(mac_address=requested_mac_address,
fixed_ips='subnet_id=%s' % subnet_v4_id),
mock.call(mac_address=requested_mac_address,
fixed_ips='subnet_id=%s' % subnet_v6_id)])
if mock_app.tag_ext:
self.assertEqual(2, mock_port_add_tag.call_count)
else:
self.assertEqual(0, mock_port_add_tag.call_count)

@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data(False, True)
def test_ipam_driver_request_address_overlapping_cidr_in_neutron(self,
use_tag_ext, mock_app, mock_list_subnetpools, mock_list_subnets,
mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
# faking list_subnetpools
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id2 = uuidutils.generate_uuid()
fake_name = lib_utils.get_neutron_subnetpool_name(FAKE_IP4_CIDR)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
neutron_network_id2 = uuidutils.generate_uuid()
neutron_subnet_v4_id = uuidutils.generate_uuid()
neutron_subnet_v4_id2 = uuidutils.generate_uuid()
# Fake existing Neutron subnets
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, neutron_subnet_v4_id,
cidr=FAKE_IP4_CIDR, name=utils.make_subnet_name(FAKE_IP4_CIDR),
tags=[fake_kuryr_subnetpool_id])
fake_v4_subnet2 = self._get_fake_v4_subnet(
neutron_network_id2, docker_endpoint_id, neutron_subnet_v4_id2,
cidr=FAKE_IP4_CIDR, name=utils.make_subnet_name(FAKE_IP4_CIDR),
tags=[fake_kuryr_subnetpool_id2])
fake_subnet_response = {
'subnets': []
}
fake_subnet_response2 = {
'subnets': [
fake_v4_subnet2['subnet'],
fake_v4_subnet['subnet']
]
}
mock_list_subnets.side_effect = [
fake_subnet_response, fake_subnet_response2]
# faking create_port
fake_neutron_port_id = uuidutils.generate_uuid()
fake_mac_address = 'fa:16:3e:86:a0:fe'
fake_port = self._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id,
neutron_subnet_v4_id=neutron_subnet_v4_id,
neutron_subnet_v4_address="10.0.0.5",
neutron_mac_address=fake_mac_address)
mock_create_port.return_value = fake_port
port_request = {
'name': const.KURYR_UNBOUND_PORT,
'admin_state_up': True,
'network_id': neutron_network_id,
}
port_request['fixed_ips'] = []
fixed_ip = {'subnet_id': neutron_subnet_v4_id}
port_request['fixed_ips'].append(fixed_ip)
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '', # Querying for container address
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
cidr=FAKE_IP4_CIDR)
mock_create_port.assert_called_with(
{'port': port_request})
if mock_app.tag_ext:
mock_port_add_tag.assert_called()
else:
mock_port_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual('10.0.0.5/16', decoded_json['Address'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
def test_ipam_driver_request_address_overlapping_cidr_no_subnet_tags(self,
mock_list_subnetpools, mock_list_subnets, mock_create_port):
# faking list_subnetpools
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id2 = uuidutils.generate_uuid()
fake_name = lib_utils.get_neutron_subnetpool_name(FAKE_IP4_CIDR)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
neutron_network_id2 = uuidutils.generate_uuid()
neutron_subnet_v4_id = uuidutils.generate_uuid()
neutron_subnet_v4_id2 = uuidutils.generate_uuid()
# Fake existing Neutron subnets
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, neutron_subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
# Simulate an existing Neutron subnet without the 'tags' attribute
del fake_v4_subnet['subnet']['tags']
fake_v4_subnet2 = self._get_fake_v4_subnet(
neutron_network_id2, docker_endpoint_id, neutron_subnet_v4_id2,
subnetpool_id=fake_kuryr_subnetpool_id2,
cidr=FAKE_IP4_CIDR)
# Simulate an existing Neutron subnet without the 'tags' attribute
del fake_v4_subnet2['subnet']['tags']
fake_subnet_response = {
'subnets': []
}
fake_subnet_response2 = {
'subnets': [
fake_v4_subnet2['subnet'],
fake_v4_subnet['subnet']
]
}
mock_list_subnets.side_effect = [
fake_subnet_response, fake_subnet_response2]
# Testing container ip allocation
fake_mac_address = 'fa:16:3e:86:a0:fe'
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '', # Querying for container address
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(w_exceptions.InternalServerError.code,
response.status_code)
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
cidr=FAKE_IP4_CIDR)
decoded_json = jsonutils.loads(response.data)
self.assertIn('Err', decoded_json)
self.assertIn(fake_kuryr_subnetpool_id, decoded_json['Err'])

@mock.patch('kuryr_libnetwork.controllers._neutron_port_add_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.create_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data(False, True)
def test_ipam_driver_request_address_overlapping_cidr_in_kuryr(
self, use_tag_ext, mock_app, mock_list_subnetpools,
mock_list_subnets, mock_create_port, mock_port_add_tag):
mock_app.tag_ext = use_tag_ext
# faking list_subnetpools
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = lib_utils.get_neutron_subnetpool_name(FAKE_IP4_CIDR)
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR],
name=fake_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
neutron_network_id2 = uuidutils.generate_uuid()
neutron_subnet_v4_id = uuidutils.generate_uuid()
neutron_subnet_v4_id2 = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, neutron_subnet_v4_id,
cidr=FAKE_IP4_CIDR,
name=utils.make_subnet_name(FAKE_IP4_CIDR))
fake_v4_subnet2 = self._get_fake_v4_subnet(
neutron_network_id2, docker_endpoint_id, neutron_subnet_v4_id2,
cidr=FAKE_IP4_CIDR,
name=utils.make_subnet_name(FAKE_IP4_CIDR))
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet'],
fake_v4_subnet2['subnet'],
]
}
mock_list_subnets.return_value = fake_subnet_response
# faking create_port
fake_neutron_port_id = uuidutils.generate_uuid()
fake_mac_address = 'fa:16:3e:86:a0:fe'
fake_port = self._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id,
neutron_subnet_v4_id=neutron_subnet_v4_id,
neutron_subnet_v4_address="10.0.0.5",
neutron_mac_address=fake_mac_address)
mock_create_port.return_value = fake_port
port_request = {
'name': const.KURYR_UNBOUND_PORT,
'admin_state_up': True,
'network_id': neutron_network_id,
}
port_request['fixed_ips'] = []
fixed_ip = {'subnet_id': neutron_subnet_v4_id}
port_request['fixed_ips'].append(fixed_ip)
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '', # Querying for container address
'Options': {const.DOCKER_MAC_ADDRESS_OPTION: fake_mac_address}
}
mock_port_add_tag.return_value = None
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_create_port.assert_called_with(
{'port': port_request})
if mock_app.tag_ext:
mock_port_add_tag.assert_called()
else:
mock_port_add_tag.assert_not_called()
decoded_json = jsonutils.loads(response.data)
self.assertEqual('10.0.0.5/16', decoded_json['Address'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
def test_ipam_driver_request_address_for_same_gateway(self,
mock_list_subnets):
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_v4_subnet['subnet'].update(gateway_ip='10.0.0.1')
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '10.0.0.1',
'Options': {
const.REQUEST_ADDRESS_TYPE: const.NETWORK_GATEWAY_OPTIONS
}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
decoded_json = jsonutils.loads(response.data)
self.assertEqual('10.0.0.1/16', decoded_json['Address'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
def test_ipam_driver_request_address_for_different_gateway(self,
mock_list_subnets):
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
# faking list_subnets
docker_endpoint_id = lib_utils.get_hash()
neutron_network_id = uuidutils.generate_uuid()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
neutron_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_v4_subnet['subnet'].update(gateway_ip='10.0.0.1')
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
# Testing container ip allocation
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': '10.0.0.5',  # Differs from the existing gateway IP
'Options': {
const.REQUEST_ADDRESS_TYPE: const.NETWORK_GATEWAY_OPTIONS
}
}
response = self.app.post('/IpamDriver.RequestAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(500, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
decoded_json = jsonutils.loads(response.data)
self.assertIn('Err', decoded_json)
err_message = ("Requested gateway {0} does not match with "
"gateway {1} in existed network.").format(
'10.0.0.5', '10.0.0.1')
self.assertEqual({'Err': err_message}, decoded_json)

@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
def test_ipam_driver_release_address(self,
mock_list_subnets, mock_list_ports, mock_delete_port):
# faking list_subnets
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
docker_network_id = lib_utils.get_hash()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
docker_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR,
name=utils.make_subnet_name(FAKE_IP4_CIDR))
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
fake_ip4 = '10.0.0.5'
# faking list_ports and delete_port
neutron_network_id = uuidutils.generate_uuid()
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER,
tags=lib_const.DEVICE_OWNER)
fake_port['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
list_port_response = {'ports': [fake_port['port']]}
mock_list_ports.return_value = list_port_response
mock_delete_port.return_value = {}
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': fake_ip4
}
response = self.app.post('/IpamDriver.ReleaseAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_list_ports.assert_called()
mock_delete_port.assert_called_with(
fake_port['port']['id'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnetpools')
def test_ipam_driver_release_address_w_existing_subnet(self,
mock_list_subnetpools, mock_list_subnets, mock_list_ports,
mock_delete_port):
# faking list_subnetpools
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_name = str('-'.join(['kuryrPool', FAKE_IP4_CIDR]))
kuryr_subnetpools = self._get_fake_v4_subnetpools(
fake_kuryr_subnetpool_id, prefixes=[FAKE_IP4_CIDR], name=fake_name)
mock_list_subnetpools.return_value = kuryr_subnetpools
# faking list_subnets
docker_network_id = lib_utils.get_hash()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
docker_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': []
}
fake_subnet_response2 = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.side_effect = [
fake_subnet_response, fake_subnet_response2]
fake_ip4 = '10.0.0.5'
# faking list_ports and delete_port
neutron_network_id = uuidutils.generate_uuid()
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER,
tags=lib_const.DEVICE_OWNER)
fake_port['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
list_port_response = {'ports': [fake_port['port']]}
mock_list_ports.return_value = list_port_response
mock_delete_port.return_value = {}
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': fake_ip4
}
response = self.app.post('/IpamDriver.ReleaseAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnetpools.assert_called_with(
id=fake_kuryr_subnetpool_id)
mock_list_subnets.assert_called_with(
cidr=FAKE_IP4_CIDR)
mock_list_ports.assert_called()
mock_delete_port.assert_called_with(
fake_port['port']['id'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.remove_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.update_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
def test_ipam_driver_release_address_w_existing_port(self,
mock_app, mock_list_subnets, mock_list_ports, mock_delete_port,
mock_update_port, mock_remove_tag):
# TODO(hongbin): The current implementation still deletes existing
# ports when the tag extension is not enabled. This needs to be fixed,
# and a test case needs to be added afterwards.
mock_app.tag_ext = True
# faking list_subnets
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
docker_network_id = lib_utils.get_hash()
docker_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
docker_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
fake_ip4 = '10.0.0.5'
# faking list_ports and delete_port
neutron_network_id = uuidutils.generate_uuid()
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER,
tags=const.KURYR_EXISTING_NEUTRON_PORT)
fake_port['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
list_port_response = {'ports': [fake_port['port']]}
mock_list_ports.return_value = list_port_response
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': fake_ip4
}
response = self.app.post('/IpamDriver.ReleaseAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_list_ports.assert_called()
expect_updated_port = {'device_owner': '',
'device_id': '', 'binding:host_id': ''}
mock_update_port.assert_called_with(fake_port['port']['id'],
{'port': expect_updated_port})
mock_delete_port.assert_not_called()
mock_remove_tag.assert_called_with('ports',
fake_port['port']['id'],
const.KURYR_EXISTING_NEUTRON_PORT)

@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
@ddt.data(False, True)
def test_ipam_driver_release_address_w_same_ip_and_cidr(
self, use_tag_ext, mock_app, mock_list_subnets, mock_list_ports,
mock_delete_port):
# Check that only the kuryr port is removed, even if another port
# has the same IP and belongs to a subnet with the same subnetpool_id
mock_app.tag_ext = use_tag_ext
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
# faking list_subnets
docker_network_id = lib_utils.get_hash()
docker_endpoint_id = lib_utils.get_hash()
no_kuryr_endpoint_id = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
docker_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
fake_ip4 = '10.0.0.5'
# faking list_ports and delete_port
neutron_network_id = uuidutils.generate_uuid()
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER)
fake_port_no_kuryr = base.TestKuryrBase._get_fake_port(
no_kuryr_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4)
fake_port_no_kuryr['port']['name'] = 'port0'
if use_tag_ext:
fake_port['port']['tags'] = [lib_const.DEVICE_OWNER]
fake_port['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
fake_port_no_kuryr['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
list_port_response = {'ports': [fake_port['port'],
fake_port_no_kuryr['port']]}
mock_list_ports.return_value = list_port_response
mock_delete_port.return_value = {}
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': fake_ip4
}
response = self.app.post('/IpamDriver.ReleaseAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_list_ports.assert_called()
mock_delete_port.assert_called_once_with(
fake_port['port']['id'])

@mock.patch('kuryr_libnetwork.controllers.app.neutron.remove_tag')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.update_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.delete_port')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_ports')
@mock.patch('kuryr_libnetwork.controllers.app.neutron.list_subnets')
@mock.patch('kuryr_libnetwork.controllers.app')
def test_ipam_driver_release_address_w_existing_port_w_same_ip_and_cidr(
self, mock_app, mock_list_subnets, mock_list_ports,
mock_delete_port, mock_update_port, mock_remove_tag):
# TODO(hongbin): The current implementation still deletes existing
# ports when the tag extension is not enabled. This needs to be fixed,
# and a test case needs to be added afterwards.
mock_app.tag_ext = True
fake_kuryr_subnetpool_id = uuidutils.generate_uuid()
fake_kuryr_subnetpool_id2 = uuidutils.generate_uuid()
# faking list_subnets
docker_network_id = lib_utils.get_hash()
docker_network_id2 = lib_utils.get_hash()
docker_endpoint_id = lib_utils.get_hash()
docker_endpoint_id2 = lib_utils.get_hash()
subnet_v4_id = uuidutils.generate_uuid()
subnet_v4_id2 = uuidutils.generate_uuid()
fake_v4_subnet = self._get_fake_v4_subnet(
docker_network_id, docker_endpoint_id, subnet_v4_id,
subnetpool_id=fake_kuryr_subnetpool_id,
cidr=FAKE_IP4_CIDR)
fake_v4_subnet2 = self._get_fake_v4_subnet(
docker_network_id2, docker_endpoint_id2, subnet_v4_id2,
subnetpool_id=fake_kuryr_subnetpool_id2,
cidr=FAKE_IP4_CIDR)
fake_subnet_response = {
'subnets': [
fake_v4_subnet['subnet'], fake_v4_subnet2['subnet']
]
}
mock_list_subnets.return_value = fake_subnet_response
fake_ip4 = '10.0.0.5'
# faking list_ports and delete_port
neutron_network_id = uuidutils.generate_uuid()
fake_neutron_port_id = uuidutils.generate_uuid()
fake_port = base.TestKuryrBase._get_fake_port(
docker_endpoint_id, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER,
tags=[const.KURYR_EXISTING_NEUTRON_PORT])
fake_port2 = base.TestKuryrBase._get_fake_port(
docker_endpoint_id2, neutron_network_id,
fake_neutron_port_id, lib_const.PORT_STATUS_ACTIVE,
subnet_v4_id2,
neutron_subnet_v4_address=fake_ip4,
device_owner=lib_const.DEVICE_OWNER,
tags=[const.KURYR_EXISTING_NEUTRON_PORT])
fake_port['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id, 'ip_address': fake_ip4}
]
fake_port2['port']['fixed_ips'] = [
{'subnet_id': subnet_v4_id2, 'ip_address': fake_ip4}
]
list_port_response = {'ports': [fake_port['port'],
fake_port2['port']]}
mock_list_ports.return_value = list_port_response
mock_delete_port.return_value = {}
fake_request = {
'PoolID': fake_kuryr_subnetpool_id,
'Address': fake_ip4
}
response = self.app.post('/IpamDriver.ReleaseAddress',
content_type='application/json',
data=jsonutils.dumps(fake_request))
self.assertEqual(200, response.status_code)
mock_list_subnets.assert_called_with(
subnetpool_id=fake_kuryr_subnetpool_id)
mock_list_ports.assert_called()
expect_updated_port = {'device_owner': '',
'device_id': '', 'binding:host_id': ''}
mock_update_port.assert_called_once_with(fake_port['port']['id'],
{'port': expect_updated_port})
mock_delete_port.assert_not_called()
mock_remove_tag.assert_called_once_with('ports',
fake_port['port']['id'],
const.KURYR_EXISTING_NEUTRON_PORT)
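The stacked @mock.patch decorators on this test rely on mock.patch feeding mocks bottom-up: the decorator closest to the function supplies the first mock argument. A minimal, self-contained sketch of that ordering rule (the stdlib targets here are chosen purely for illustration):

```python
import os
import unittest.mock as mock

@mock.patch('os.path.exists')   # outermost patch -> last mock argument
@mock.patch('os.path.isfile')   # innermost patch -> first mock argument
def check(mock_isfile, mock_exists):
    mock_isfile.return_value = True
    mock_exists.return_value = False
    return os.path.isfile('x'), os.path.exists('x')

print(check())  # -> (True, False)
```

The same rule explains why mock_list_subnets (the innermost remaining patch shown) arrives before the mocks added by decorators further out.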
# File: t2t_bert/pretrain_finetuning/token_generator_igr.py (repo yyht/bert, Apache-2.0)
import tensorflow as tf
import numpy as np
from utils.bert import bert_utils
from utils.bert import bert_modules, albert_modules
from tensorflow.python.framework import ops
class FlipGradientBuilder(object):
def __init__(self):
self.num_calls = 0
def __call__(self, x, l=1.0):
grad_name = "FlipGradient%d" % self.num_calls
@ops.RegisterGradient(grad_name)
def _flip_gradients(op, grad):
return [tf.negative(grad) * l]
g = tf.get_default_graph()
with g.gradient_override_map({"Identity": grad_name}):
y = tf.identity(x)
self.num_calls += 1
return y
flip_gradient = FlipGradientBuilder()
def sample_gumbel(shape, samples=1, eps=1e-20):
"""Sample from Gumbel(0, 1)"""
if samples > 1:
sample_shape = shape + [samples]
else:
sample_shape = shape
U = tf.random_uniform(sample_shape, minval=0, maxval=1)
return -tf.log(-tf.log(U + eps) + eps)
def gumbel_softmax(logits, temperature, samples=1):
""" Draw a sample from the Gumbel-Softmax distribution"""
input_shape_list = bert_utils.get_shape_list(logits, expected_rank=2)
if samples > 1:
logits = tf.expand_dims(logits, -1)
y = logits + sample_gumbel(input_shape_list, samples)
return [tf.exp(tf.nn.log_softmax(y / temperature, axis=1)),
y]
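gumbel_softmax above adds Gumbel(0, 1) noise to the logits and softens the resulting argmax with a temperature. A NumPy sketch of the same idea, kept separate from the TF graph code (function names here are illustrative, not the module's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_gumbel_np(shape, eps=1e-20):
    # Gumbel(0, 1) via the inverse-CDF trick, as in sample_gumbel above.
    u = rng.uniform(size=shape)
    return -np.log(-np.log(u + eps) + eps)

def gumbel_softmax_np(logits, temperature):
    # Perturb logits with Gumbel noise, then take a tempered softmax.
    y = logits + sample_gumbel_np(logits.shape)
    z = y / temperature
    z = z - z.max(axis=-1, keepdims=True)      # numerically stable softmax
    p = np.exp(z)
    return p / p.sum(axis=-1, keepdims=True)

logits = np.log(np.array([[0.7, 0.2, 0.1]]))
probs = gumbel_softmax_np(logits, temperature=0.5)
# Each row is a valid distribution; lower temperatures push it toward one-hot.
```

As the temperature approaches zero the output concentrates on argmax(logits + noise), which is exactly a categorical sample from softmax(logits).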
def sample_normal(shape, samples=1, eps=1e-20):
if samples > 1:
sample_shape = shape + [samples]
else:
sample_shape = shape
epsilon = tf.random.normal(shape=sample_shape)
return epsilon
def iso_gaussian_sample(logits, temperature, samples=1):
input_shape_list = bert_utils.get_shape_list(logits, expected_rank=2)
if samples > 1:
logits = tf.expand_dims(logits, -1)
y = logits + sample_normal(input_shape_list, samples)
return [tf.exp(tf.nn.log_softmax(y / temperature)), logits]
def token_generator_igr(config, input_tensor,
output_weights,
input_ids,
input_ori_ids,
input_mask,
**kargs):
input_shape_list = bert_utils.get_shape_list(input_tensor, expected_rank=3)
batch_size = input_shape_list[0]
seq_length = input_shape_list[1]
hidden_dims = input_shape_list[2]
embedding_projection = kargs.get('embedding_projection', None)
scope = kargs.get('scope', None)
if scope:
scope = scope + '/' + 'cls/predictions'
else:
scope = 'cls/predictions'
tf.logging.info("**** mlm generator scope **** %s", str(scope))
# with tf.variable_scope("cls/predictions", reuse=tf.AUTO_REUSE):
with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
if config.get('ln_type', 'postln') == 'preln':
input_tensor = albert_modules.layer_norm(input_tensor)
elif config.get('ln_type', 'postln') == 'postln':
input_tensor = input_tensor
else:
input_tensor = input_tensor
# if config.get("embedding", "factorized") == "factorized":
# projection_width = config.hidden_size
# else:
# projection_width = config.embedding_size
if config.get("embedding", "none_factorized") == "none_factorized":
projection_width = config.hidden_size
tf.logging.info("==not using embedding factorized==")
else:
projection_width = config.get('embedding_size', config.hidden_size)
tf.logging.info("==using embedding factorized: embedding size: %s==", str(projection_width))
with tf.variable_scope("transform"):
input_tensor = tf.layers.dense(
input_tensor,
units=projection_width,
activation=albert_modules.get_activation(config.hidden_act),
kernel_initializer=albert_modules.create_initializer(
config.initializer_range))
if config.get('ln_type', 'postln') == 'preln':
input_tensor = input_tensor
elif config.get('ln_type', 'postln') == 'postln':
input_tensor = albert_modules.layer_norm(input_tensor)
else:
input_tensor = albert_modules.layer_norm(input_tensor)
if embedding_projection is not None:
# batch x seq x hidden, embedding x hidden
print(input_tensor.get_shape(), embedding_projection.get_shape())
input_tensor = tf.einsum("abc,dc->abd", input_tensor, embedding_projection)
else:
print("==no need for embedding projection==")
input_tensor = input_tensor
output_bias = tf.get_variable(
"output_bias",
shape=[config.vocab_size],
initializer=tf.zeros_initializer())
# batch x seq x embedding
logits = tf.einsum("abc,dc->abd", input_tensor, output_weights)
logits = tf.nn.bias_add(logits, output_bias)
input_shape_list = bert_utils.get_shape_list(logits, expected_rank=3)
width = input_shape_list[2]
# logits_tempered = tf.nn.log_softmax(logits, axis=-1)
# width=config.vocab_size
flat_logits_tempered = tf.reshape(logits,
[batch_size * seq_length, width])
num_train_steps = kargs.get('num_train_steps', None)
if num_train_steps and kargs.get('gumbel_anneal', "anneal") == 'anneal':
tf.logging.info("****** apply annealed temperature ******* %s", str(num_train_steps))
annealed_temp = tf.train.polynomial_decay(config.get('gumbel_temperature', 1.0),
tf.train.get_or_create_global_step(),
kargs.get("num_train_steps", 10000),
end_learning_rate=0.1,
power=1.0,
cycle=False)
elif kargs.get('gumbel_anneal', "anneal") == 'softplus':
tf.logging.info("****** apply auto-scale temperature *******")
# batch x seq x dim
with tf.variable_scope("gumbel_auto_scaling_temperature"):
annealed_temp = tf.layers.dense(input_tensor,
1,
activation=tf.nn.softplus,
) + 1.0
annealed_temp = 1./ annealed_temp
annealed_temp = tf.reshape(annealed_temp, [batch_size * seq_length, 1])
if config.get('gen_sample', 1) > 1:
tf.logging.info("****** apply auto-scale temperature for multi-sampling *******")
annealed_temp = tf.expand_dims(annealed_temp, -1)
else:
annealed_temp = 0.01
tf.logging.info("****** not applying annealed temperature, fixed temp ******* %s", str(annealed_temp))
# [batch x seq] x config.vocab_size x config.get('gen_sample', 1)
sampled_logprob_temp, sampled_logprob = iso_gaussian_sample(flat_logits_tempered,
temperature=annealed_temp,
samples=config.get('gen_sample', 1))
# argmax on config.vocab_size which is always axis=1
# [batch x seq] x config.vocab_size x config.get('gen_sample', 1)
# argmax(logits + noise samples) to sample from a categorical distribution
if kargs.get('sampled_prob_id', True):
tf.logging.info("****** apply categorical sampled id of original logits *******")
sampled_id = tf.one_hot(tf.argmax(sampled_logprob, axis=1),
config.vocab_size,
axis=1) # sampled multinomial id
else:
tf.logging.info("****** apply gumbel-softmax logprob for logits *******")
sampled_id = tf.one_hot(tf.argmax(sampled_logprob_temp, axis=1),
config.vocab_size,
axis=1) # sampled multinomial id
# straight-through gumbel softmax estimator
if kargs.get("straight_through", True):
tf.logging.info("****** apply straight_through_estimator *******")
sampled_id = tf.stop_gradient(sampled_id-sampled_logprob_temp) + flip_gradient(sampled_logprob_temp)
else:
tf.logging.info("****** apply gumbel-softmax probs *******")
sampled_id = flip_gradient(sampled_logprob_temp)
sampled_binary_mask = kargs.get('sampled_binary_mask', None)
if sampled_binary_mask is not None:
label_diff_ids = tf.identity(sampled_binary_mask) # 0 for original and 1 for replace
else:
label_diff_ids = tf.not_equal(
tf.cast(input_ids, tf.int32),
tf.cast(input_ori_ids, tf.int32) # 0 for original and 1 for replace
)
label_diff_ids = tf.cast(label_diff_ids, tf.float32)
label_diff_ids = tf.expand_dims(label_diff_ids, axis=[-1]) # batch x seq x 1
input_ori_ids = tf.one_hot(input_ori_ids, config.vocab_size) # batch x seq x vocab
input_ori_ids = tf.cast(input_ori_ids, tf.float32)
if config.get('gen_sample', 1) == 1:
sampled_input_id = tf.reshape(sampled_id, [batch_size, seq_length, config.vocab_size])
if kargs.get('mask_method', 'only_mask') == 'only_mask':
tf.logging.info("****** only mask sample *******")
label_diff_ids = tf.cast(label_diff_ids, tf.float32)
sampled_input_id = (label_diff_ids) * tf.cast(sampled_input_id, tf.float32) + (1 - label_diff_ids) * tf.cast(input_ori_ids, tf.float32)
else:
sampled_input_id = tf.reshape(sampled_id, [batch_size, seq_length, config.vocab_size, config.get('gen_sample', 1)])
label_diff_ids = tf.expand_dims(label_diff_ids, axis=-1) # batch x seq x 1
input_ori_ids = tf.expand_dims(input_ori_ids, axis=-1) # batch x seq x vocab x 1
if kargs.get('mask_method', 'only_mask') == 'only_mask':
tf.logging.info("****** only mask sample *******")
sampled_input_id = (label_diff_ids) * tf.cast(sampled_input_id, tf.float32) + (1 - label_diff_ids) * tf.cast(input_ori_ids, tf.float32)
return sampled_input_id
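The straight-through branch above computes tf.stop_gradient(sampled_id - sampled_logprob_temp) + flip_gradient(sampled_logprob_temp): the forward value is the hard one-hot, while gradients flow only through the soft term (here additionally reversed by flip_gradient). A NumPy check of the forward-value identity, which holds because stop_gradient never changes values:

```python
import numpy as np

soft = np.array([0.1, 0.6, 0.3])      # relaxed (soft) sample
hard = np.eye(3)[np.argmax(soft)]     # one-hot of the argmax: [0., 1., 0.]

# Forward value of stop_gradient(hard - soft) + soft is exactly `hard`;
# in backprop, only the trailing `+ soft` term receives gradient.
st_forward = (hard - soft) + soft
```

So the discriminator downstream always sees discrete one-hot tokens, yet the generator still receives a (reversed) gradient through the relaxed sample.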
def token_generator_gumbel_normal(config, input_tensor,
output_weights,
input_ids,
input_ori_ids,
input_mask,
**kargs):
input_shape_list = bert_utils.get_shape_list(input_tensor, expected_rank=3)
batch_size = input_shape_list[0]
seq_length = input_shape_list[1]
hidden_dims = input_shape_list[2]
embedding_projection = kargs.get('embedding_projection', None)
scope = kargs.get('scope', None)
if scope:
scope = scope + '/' + 'cls/predictions'
else:
scope = 'cls/predictions'
tf.logging.info("**** mlm generator scope **** %s", str(scope))
# with tf.variable_scope("cls/predictions", reuse=tf.AUTO_REUSE):
with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
if config.get('ln_type', 'postln') == 'preln':
input_tensor = albert_modules.layer_norm(input_tensor)
elif config.get('ln_type', 'postln') == 'postln':
input_tensor = input_tensor
else:
input_tensor = input_tensor
# if config.get("embedding", "factorized") == "factorized":
# projection_width = config.hidden_size
# else:
# projection_width = config.embedding_size
if config.get("embedding", "none_factorized") == "none_factorized":
projection_width = config.hidden_size
tf.logging.info("==not using embedding factorized==")
else:
projection_width = config.get('embedding_size', config.hidden_size)
tf.logging.info("==using embedding factorized: embedding size: %s==", str(projection_width))
with tf.variable_scope("transform"):
input_tensor = tf.layers.dense(
input_tensor,
units=projection_width,
activation=albert_modules.get_activation(config.hidden_act),
kernel_initializer=albert_modules.create_initializer(
config.initializer_range))
if config.get('ln_type', 'postln') == 'preln':
input_tensor = input_tensor
elif config.get('ln_type', 'postln') == 'postln':
input_tensor = albert_modules.layer_norm(input_tensor)
else:
input_tensor = albert_modules.layer_norm(input_tensor)
if embedding_projection is not None:
# batch x seq x hidden, embedding x hidden
print(input_tensor.get_shape(), embedding_projection.get_shape())
input_tensor = tf.einsum("abc,dc->abd", input_tensor, embedding_projection)
else:
print("==no need for embedding projection==")
input_tensor = input_tensor
output_bias = tf.get_variable(
"output_bias",
shape=[config.vocab_size],
initializer=tf.zeros_initializer())
# batch x seq x embedding
logits = tf.einsum("abc,dc->abd", input_tensor, output_weights)
logits = tf.nn.bias_add(logits, output_bias)
input_shape_list = bert_utils.get_shape_list(logits, expected_rank=3)
width = input_shape_list[2]
logits_tempered = tf.nn.log_softmax(logits, axis=-1)
# width=config.vocab_size
flat_logits_tempered = tf.reshape(logits_tempered,
[batch_size * seq_length, width])
num_train_steps = kargs.get('num_train_steps', None)
if num_train_steps and kargs.get('gumbel_anneal', True):
tf.logging.info("****** apply annealed temperature ******* %s", str(num_train_steps))
annealed_temp = tf.train.polynomial_decay(config.get('gumbel_temperature', 1.0),
tf.train.get_or_create_global_step(),
kargs.get("num_train_steps", 10000),
end_learning_rate=0.1,
power=1.0,
cycle=False)
else:
annealed_temp = 1.0
tf.logging.info("****** not applying annealed temperature, fixed temp ******* %s", str(annealed_temp))
# [batch x seq] x config.vocab_size x config.get('gen_sample', 1)
sampled_logprob_temp, sampled_logprob = gumbel_softmax(flat_logits_tempered,
temperature=annealed_temp,
samples=config.get('gen_sample', 1))
# argmax on config.vocab_size which is always axis=1
# [batch x seq] x config.vocab_size x config.get('gen_sample', 1)
# argmax(logits + gumbel_samples) to sample from a categorical distribution
if kargs.get('sampled_prob_id', True):
tf.logging.info("****** apply categorical sampled id of original logits *******")
sampled_id = tf.one_hot(tf.argmax(sampled_logprob, axis=1),
config.vocab_size,
axis=1) # sampled multinomial id
else:
tf.logging.info("****** apply gumbel-softmax logprob for logits *******")
sampled_id = tf.one_hot(tf.argmax(sampled_logprob_temp, axis=1),
config.vocab_size,
axis=1) # sampled multinomial id
# straight-through gumbel softmax estimator
if kargs.get("straight_through", True):
tf.logging.info("****** apply straight_through_estimator without grl *******")
sampled_id = tf.stop_gradient(sampled_id-sampled_logprob_temp) + (sampled_logprob_temp)
else:
tf.logging.info("****** apply gumbel-softmax probs without grl *******")
sampled_id = flip_gradient(sampled_logprob_temp)
sampled_binary_mask = kargs.get('sampled_binary_mask', None)
if sampled_binary_mask is not None:
label_diff_ids = tf.identity(sampled_binary_mask) # 0 for original and 1 for replace
else:
label_diff_ids = tf.not_equal(
tf.cast(input_ids, tf.int32),
tf.cast(input_ori_ids, tf.int32) # 0 for original and 1 for replace
)
label_diff_ids = tf.cast(label_diff_ids, tf.float32)
label_diff_ids = tf.expand_dims(label_diff_ids, axis=[-1]) # batch x seq x 1
input_ori_ids = tf.one_hot(input_ori_ids, config.vocab_size) # batch x seq x vocab
input_ori_ids = tf.cast(input_ori_ids, tf.float32)
if config.get('gen_sample', 1) == 1:
sampled_input_id = tf.reshape(sampled_id, [batch_size, seq_length, config.vocab_size])
if kargs.get('mask_method', 'only_mask') == 'only_mask':
tf.logging.info("****** only mask sample *******")
label_diff_ids = tf.cast(label_diff_ids, tf.float32)
sampled_input_id = (label_diff_ids) * tf.cast(sampled_input_id, tf.float32) + (1 - label_diff_ids) * tf.cast(input_ori_ids, tf.float32)
else:
sampled_input_id = tf.reshape(sampled_id, [batch_size, seq_length, config.vocab_size, config.get('gen_sample', 1)])
label_diff_ids = tf.expand_dims(label_diff_ids, axis=-1) # batch x seq x 1
input_ori_ids = tf.expand_dims(input_ori_ids, axis=-1) # batch x seq x vocab x 1
if kargs.get('mask_method', 'only_mask') == 'only_mask':
tf.logging.info("****** only mask sample *******")
sampled_input_id = (label_diff_ids) * tf.cast(sampled_input_id, tf.float32) + (1 - label_diff_ids) * tf.cast(input_ori_ids, tf.float32)
return sampled_input_id
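Both generators anneal the temperature with tf.train.polynomial_decay from 1.0 down to 0.1 over num_train_steps. With power=1.0 and cycle=False this reduces to plain linear interpolation; a pure-Python sketch (the helper name is hypothetical):

```python
def polynomial_decay(start, end, step, total_steps, power=1.0):
    # Mirrors tf.train.polynomial_decay with cycle=False:
    # value = (start - end) * (1 - step/total)^power + end
    step = min(step, total_steps)
    frac = 1.0 - step / float(total_steps)
    return (start - end) * (frac ** power) + end

# Anneal from 1.0 to 0.1 over 10000 steps, as configured above.
temps = [polynomial_decay(1.0, 0.1, s, 10000) for s in (0, 5000, 10000)]
# temps is approximately [1.0, 0.55, 0.1]
```

Past total_steps the schedule clamps at the end value, matching TF's behavior when cycling is disabled.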
# File: fileSystem/school-projects/development/softwaredesignandcomputerlogiccis122/cis122lab4/python/credit_card_functions.py (repo nomad-mystic/nomadmystic, MIT)
# programmer = Keith Murphy
# File = credit_card_functions.py
# Date Created = 2-26-2015
# Last Mod = 2-26-2015
# Import regular expressions external library
import re
# Declare str credit_card_number
# Declare str credit_card_type
# Declare str valid_card_number
# Declare str credit_card_number
# Declare str matched_valid_card_number
# Declare Boolean invalid_card_number
# Declare Boolean tested_card_number_validation
# Function credit_card_input()
# Declare str credit_card_type
# Declare str credit_card_number
#
# Display Please enter your credit card type (Examples: Mastercard, Visa, Discover):
# Input credit_card_type
# Display Please enter your credit card number(No Spaces):
# Input credit_card_number
#
# Return credit_card_number, credit_card_type
# End Function
def credit_card_input():
credit_card_type = input('Please enter your credit card type (Examples: Mastercard, Visa, Discover): ')
credit_card_number = input('Please enter your credit card number(No Spaces): ')
return credit_card_number, credit_card_type
# Function str str check_credit_card_type(str credit_card_number, str credit_card_type)
# Declare str credit_card_type
# Declare str credit_card_number
# Declare Boolean tested_card_number_validation
#
# If credit_card_type == 'Mastercard' or credit_card_type == 'mastercard' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_master_card_validation(credit_card_number, credit_card_type)
#
# Set credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
# credit_card_number,
# credit_card_type)
# Return credit_card_number, credit_card_type
#
# Else If credit_card_type == 'Visa' or credit_card_type == 'visa' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_visa_card_validation(credit_card_number, credit_card_type)
#
# Set credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
# credit_card_number,
# credit_card_type)
# Return credit_card_number, credit_card_type
#
# Else If credit_card_type == 'Discover' or credit_card_type == 'discover' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_discover_card_validation(credit_card_number, credit_card_type)
#
# Set credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
# credit_card_number,
# credit_card_type)
# Return credit_card_number, credit_card_type
# Else
# Display Sorry I didn't understand what you entered, Please try Again
# Call credit_card_input()
# End If
# End Function
def check_credit_card_type(credit_card_number, credit_card_type):
if credit_card_type == 'Mastercard' or credit_card_type == 'mastercard':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_master_card_validation(credit_card_number, credit_card_type)
credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
credit_card_number,
credit_card_type)
return credit_card_number, credit_card_type
elif credit_card_type == 'Visa' or credit_card_type == 'visa':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_visa_card_validation(credit_card_number, credit_card_type)
credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
credit_card_number,
credit_card_type)
return credit_card_number, credit_card_type
elif credit_card_type == 'Discover' or credit_card_type == 'discover':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_discover_card_validation(credit_card_number, credit_card_type)
credit_card_number, credit_card_type = credit_card_validation_loop(tested_card_number_validation,
credit_card_number,
credit_card_type)
return credit_card_number, credit_card_type
else:
print("Sorry I didn't understand what you entered, Please try Again")
credit_card_input()
# Function Boolean str str credit_card_validation_loop(Boolean tested_card_number_validation,
# str credit_card_number, str credit_card_type)
# Declare str credit_card_number
# Declare Boolean tested_card_number_validation
# Declare str credit_card_type
# While not tested_card_number_validation
#
# Set credit_card_number, credit_card_type = credit_card_input()
#
# If credit_card_type == 'Mastercard' or credit_card_type == 'mastercard' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_master_card_validation(credit_card_number, credit_card_type)
#
# Else If credit_card_type == 'Visa' or credit_card_type == 'visa' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_visa_card_validation(credit_card_number, credit_card_type)
#
# Else If credit_card_type == 'Discover' or credit_card_type == 'discover' Then
# Set tested_card_number_validation, credit_card_number, credit_card_type = \
# run_discover_card_validation(credit_card_number, credit_card_type)
# End If
# End While
# Set credit_card_number = 'Your card was approved. Thank You for shopping with us here at NomadMystics.com '
# Return credit_card_number, credit_card_type
# End Function
def credit_card_validation_loop(tested_card_number_validation, credit_card_number, credit_card_type):
while not tested_card_number_validation:
credit_card_number, credit_card_type = credit_card_input()
if credit_card_type == 'Mastercard' or credit_card_type == 'mastercard':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_master_card_validation(credit_card_number, credit_card_type)
elif credit_card_type == 'Visa' or credit_card_type == 'visa':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_visa_card_validation(credit_card_number, credit_card_type)
elif credit_card_type == 'Discover' or credit_card_type == 'discover':
tested_card_number_validation, credit_card_number, credit_card_type = \
run_discover_card_validation(credit_card_number, credit_card_type)
credit_card_number = 'Your card was approved. Thank You for shopping with us here at NomadMystics.com '
return credit_card_number, credit_card_type
# Function str str run_master_card_validation(str credit_card_number, str credit_card_type)
# Declare str valid_card_number
# Declare str credit_card_number
# Declare str matched_valid_card_number
# Declare Boolean invalid_card_number
# Declare Boolean tested_card_number_validation
#
# valid_card_number = re.compile('^5[1-5][0-9]{14}$')
# matched_valid_card_number = valid_card_number.match(credit_card_number)
#
# If matched_valid_card_number Then
#
# Return True, matched_valid_card_number.group(), credit_card_type
# Else
# Set invalid_card_number = False
# Set tested_card_number_validation = False
#
# Return tested_card_number_validation, invalid_card_number, credit_card_type
# End If
# End Function
def run_master_card_validation(credit_card_number, credit_card_type):
valid_card_number = re.compile('^5[1-5][0-9]{14}$')
matched_valid_card_number = valid_card_number.match(credit_card_number)
if matched_valid_card_number:
return True, matched_valid_card_number.group(), credit_card_type
else:
invalid_card_number = False
tested_card_number_validation = False
return tested_card_number_validation, invalid_card_number, credit_card_type
# Function str str run_visa_card_validation(str credit_card_number, str credit_card_type)
# Declare str valid_card_number
# Declare str credit_card_number
# Declare str matched_valid_card_number
# Declare Boolean invalid_card_number
# Declare Boolean tested_card_number_validation
#
# valid_card_number = re.compile('^4[0-9]{12}(?:[0-9]{3})?$')
# matched_valid_card_number = valid_card_number.match(credit_card_number)
#
# If matched_valid_card_number Then
#
# Return True, matched_valid_card_number.group(), credit_card_type
# Else
# Set invalid_card_number = False
# Set tested_card_number_validation = False
#
# Return tested_card_number_validation, invalid_card_number, credit_card_type
# End If
# End Function
def run_visa_card_validation(credit_card_number, credit_card_type):
valid_card_number = re.compile('^4[0-9]{12}(?:[0-9]{3})?$')
matched_valid_card_number = valid_card_number.match(credit_card_number)
if matched_valid_card_number:
return True, matched_valid_card_number.group(), credit_card_type
else:
invalid_card_number = False
tested_card_number_validation = False
return tested_card_number_validation, invalid_card_number, credit_card_type
# Function str str run_discover_card_validation(str credit_card_number, str credit_card_type)
# Declare str valid_card_number
# Declare str credit_card_number
# Declare str matched_valid_card_number
# Declare Boolean invalid_card_number
# Declare Boolean tested_card_number_validation
#
# valid_card_number = re.compile('^6(?:011|5[0-9]{2})[0-9]{12}$')
# matched_valid_card_number = valid_card_number.match(credit_card_number)
#
# If matched_valid_card_number Then
#
# Return True, matched_valid_card_number.group(), credit_card_type
# Else
# Set invalid_card_number = False
# Set tested_card_number_validation = False
#
# Return tested_card_number_validation, invalid_card_number, credit_card_type
# End If
# End Function
def run_discover_card_validation(credit_card_number, credit_card_type):
valid_card_number = re.compile('^6(?:011|5[0-9]{2})[0-9]{12}$')
matched_valid_card_number = valid_card_number.match(credit_card_number)
if matched_valid_card_number:
return True, matched_valid_card_number.group(), credit_card_type
else:
invalid_card_number = False
tested_card_number_validation = False
return tested_card_number_validation, invalid_card_number, credit_card_type
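The three validators above differ only in their regular expression. A quick self-check of those exact patterns against publicly documented test numbers (standard examples, not real cards):

```python
import re

# Patterns copied verbatim from the validators above.
PATTERNS = {
    'mastercard': r'^5[1-5][0-9]{14}$',
    'visa': r'^4[0-9]{12}(?:[0-9]{3})?$',
    'discover': r'^6(?:011|5[0-9]{2})[0-9]{12}$',
}

SAMPLES = {
    'mastercard': '5500005555555559',
    'visa': '4111111111111111',
    'discover': '6011000990139424',
}

for card_type, number in SAMPLES.items():
    assert re.match(PATTERNS[card_type], number), card_type

# Visa alone also accepts 13-digit numbers:
assert re.match(PATTERNS['visa'], '4222222222222')
# And a Visa number must not pass the Mastercard check:
assert not re.match(PATTERNS['mastercard'], '4111111111111111')
```

Note these patterns only check prefix and length; they do not perform a Luhn checksum.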
# File: Thrift/gen-py/SpotifakeServices/TrackService.py (repo BrunoLujan/Spotifake-DESER, Apache-2.0)
#
# Autogenerated by Thrift Compiler (0.13.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .ttypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def GetTrackByTitle(self, title):
"""
Get Track by Title
@param title
The Track Title to be obtained
@return Track
Track object
Parameters:
- title
"""
pass
def GetTrackByAlbumId(self, idAlbum):
"""
Get Track by idAlbum
@param idAlbum
The Album Id whose tracks will be obtained
@return Track
list<Track>
Parameters:
- idAlbum
"""
pass
def GetTrackByPlaylistId(self, idPlaylist):
"""
Get Track by idPlaylist
@param idPlaylist
The Playlist Id whose tracks will be obtained
@return Track
list<Track>
Parameters:
- idPlaylist
"""
pass
def GetTrackByLibraryId(self, idLibrary):
"""
Get Track by idLibrary
@param idLibrary
The Library Id to be obtained
@return Track
list<Track>
Parameters:
- idLibrary
"""
pass
def AddTrackToAlbum(self, idAlbum, newTrack, idContentCreator):
"""
Add a Track to an Album.
@param idAlbum
The Album Id to which a track will be added
@param newTrack
@return Track
Track object added
Parameters:
- idAlbum
- newTrack
- idContentCreator
"""
pass
def AddFeaturingTrack(self, idNewTrack, idContenCreator):
"""
Register a featuring Track
@param newTrack
@return idNewTrack
Featuring added
Parameters:
- idNewTrack
- idContenCreator
"""
pass
def DeleteAlbumTrack(self, idAlbum, trackNumber):
"""
Delete a Track from an Album
@param idAlbum
The Album Id from which a track will be deleted.
@param trackNumber
The Track number which will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idAlbum
- trackNumber
"""
pass
def GetTrackByQuery(self, query):
"""
Get Track by Query
@param query
The query to be obtained
@return Track
list<Track>
Parameters:
- query
"""
pass
def UpdateAlbumTrackTitle(self, idAlbum, trackNumber, newAlbumTrackTitle):
"""
Update previously registered Album track title.
@param idAlbum
The Album Id of the Album which requires an updated track title.
@param trackNumber
The Track number of the Track which requires an updated title
@return Album
Modified Album object.
Parameters:
- idAlbum
- trackNumber
- newAlbumTrackTitle
"""
pass
def UpdateAlbumTrackFeaturing(self, idAlbum, trackNumber, newFeaturing):
"""
Update previously registered Album track featuring.
@param idAlbum
The Album Id of the Album which requires an updated track featuring.
@param trackNumber
The Track number of the Track which requires an updated featuring
@return Album
Modified Album object.
Parameters:
- idAlbum
- trackNumber
- newFeaturing
"""
pass
def AddTrackToLibrary(self, idLibrary, idTrack):
"""
Add a Track to Library.
@param idLibrary
The Library Id to which a track will be added
@param newTrack
@return Track
Track object added
Parameters:
- idLibrary
- idTrack
"""
pass
def DeleteLibraryTrack(self, idLibrary, trackNumber):
"""
Delete a Track from a Library
@param idLibrary
The Library Id from which a track will be deleted.
@param trackNumber
The Track number which will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idLibrary
- trackNumber
"""
pass
def AddTrackToPlaylist(self, idPlaylist, idTrack):
"""
Add a Track to Playlist.
@param idPlaylist
The Playlist Id to which a track will be added
@param newTrack
@return Track
Track object added
Parameters:
- idPlaylist
- idTrack
"""
pass
def DeletePlaylistTrack(self, idPlaylist, trackNumber):
"""
Delete a Track from a Playlist
@param idPlaylist
The Playlist Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idPlaylist
- trackNumber
"""
pass
def AddTrackToPlayQueue(self, idPlayQueu, newTrack):
"""
Add a Track to PlayQueue.
@param idPlayQueu
The PlayQueue Id to which a track will be added
@param newTrack
@return Track
Track object added
Parameters:
- idPlayQueu
- newTrack
"""
pass
def DeletePlayQueueTrack(self, idPlayQueu, trackNumber):
"""
Delete a Track from a PlayQueue
@param idPlayQueu
The PlayQueue Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idPlayQueu
- trackNumber
"""
pass
def GenerateRadioStation(self, idGender):
"""
Generate a Radio Station
@param idGender
The genre Id for which the radio station will be generated.
@return tracks
List of tracks that belong to the genre entered.
Parameters:
- idGender
"""
pass
def GetLocalTracksByIdConsumer(self, idConsumer):
"""
Get Local Tracks By Id Consumer.
@param idConsumer
The Consumer Id whose Tracks will be obtained
@return LocalTracks
List of tracks which belong to idConsumer
Parameters:
- idConsumer
"""
pass
def AddLocalTrack(self, LocalTrack):
"""
Add Local Track.
@param LocalTrack
The Local Track which will be added
@return LocalTrack
The Local Track object added.
Parameters:
- LocalTrack
"""
pass
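Each method in the Client below wraps the same two-step exchange: a send_* call that serializes the arguments into a CALL message, and a recv_* call that reads back either a success value or a declared exception. A minimal standalone sketch of that pattern (all names here, such as MiniClient and LoopbackWire, are illustrative stand-ins and not part of the generated module):

```python
class MiniResult:
    # Stand-in for a generated *_result struct: one success slot, one error slot.
    def __init__(self, success=None, error=None):
        self.success = success
        self.error = error

class LoopbackWire:
    # Stand-in for the protocol/transport pair: echoes the last CALL as a REPLY.
    def write(self, message):
        self._last = message
    def read(self):
        name, seqid, payload = self._last
        return MiniResult(success={"title": payload})

class MiniClient:
    def __init__(self, wire):
        self._wire = wire
        self._seqid = 0
    def GetTrackByTitle(self, title):
        # Same shape as the generated method: send, then block on recv.
        self.send_GetTrackByTitle(title)
        return self.recv_GetTrackByTitle()
    def send_GetTrackByTitle(self, title):
        self._wire.write(("GetTrackByTitle", self._seqid, title))
    def recv_GetTrackByTitle(self):
        result = self._wire.read()
        if result.success is not None:
            return result.success
        if result.error is not None:
            raise result.error
        raise RuntimeError("GetTrackByTitle failed: unknown result")

client = MiniClient(LoopbackWire())
print(client.GetTrackByTitle("Imagine"))  # {'title': 'Imagine'}
```

In the real Client the wire is a Thrift protocol over a transport, and the error slots are the service-declared exceptions (sErrorNotFoundE, sErrorSystemE) raised back to the caller.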
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def GetTrackByTitle(self, title):
"""
Get Track by Title
@param title
The Track Title to be obtained
@return Track
Track object
Parameters:
- title
"""
self.send_GetTrackByTitle(title)
return self.recv_GetTrackByTitle()
def send_GetTrackByTitle(self, title):
self._oprot.writeMessageBegin('GetTrackByTitle', TMessageType.CALL, self._seqid)
args = GetTrackByTitle_args()
args.title = title
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetTrackByTitle(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetTrackByTitle_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetTrackByTitle failed: unknown result")
def GetTrackByAlbumId(self, idAlbum):
"""
Get Track by idAlbum
@param idAlbum
The Album Id whose Tracks will be obtained
@return Track
list<Track>
Parameters:
- idAlbum
"""
self.send_GetTrackByAlbumId(idAlbum)
return self.recv_GetTrackByAlbumId()
def send_GetTrackByAlbumId(self, idAlbum):
self._oprot.writeMessageBegin('GetTrackByAlbumId', TMessageType.CALL, self._seqid)
args = GetTrackByAlbumId_args()
args.idAlbum = idAlbum
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetTrackByAlbumId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetTrackByAlbumId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetTrackByAlbumId failed: unknown result")
def GetTrackByPlaylistId(self, idPlaylist):
"""
Get Track by idPlaylist
@param idPlaylist
The Playlist Id whose Tracks will be obtained
@return Track
list<Track>
Parameters:
- idPlaylist
"""
self.send_GetTrackByPlaylistId(idPlaylist)
return self.recv_GetTrackByPlaylistId()
def send_GetTrackByPlaylistId(self, idPlaylist):
self._oprot.writeMessageBegin('GetTrackByPlaylistId', TMessageType.CALL, self._seqid)
args = GetTrackByPlaylistId_args()
args.idPlaylist = idPlaylist
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetTrackByPlaylistId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetTrackByPlaylistId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetTrackByPlaylistId failed: unknown result")
def GetTrackByLibraryId(self, idLibrary):
"""
Get Track by idLibrary
@param idLibrary
The Library Id to be obtained
@return Track
list<Track>
Parameters:
- idLibrary
"""
self.send_GetTrackByLibraryId(idLibrary)
return self.recv_GetTrackByLibraryId()
def send_GetTrackByLibraryId(self, idLibrary):
self._oprot.writeMessageBegin('GetTrackByLibraryId', TMessageType.CALL, self._seqid)
args = GetTrackByLibraryId_args()
args.idLibrary = idLibrary
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetTrackByLibraryId(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetTrackByLibraryId_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetTrackByLibraryId failed: unknown result")
def AddTrackToAlbum(self, idAlbum, newTrack, idContentCreator):
"""
Add a Track to an Album.
@param idAlbum
The Album Id which a track will be added
@param newTrack
The Track object which will be added
@return Track
Track object added
Parameters:
- idAlbum
- newTrack
- idContentCreator
"""
self.send_AddTrackToAlbum(idAlbum, newTrack, idContentCreator)
return self.recv_AddTrackToAlbum()
def send_AddTrackToAlbum(self, idAlbum, newTrack, idContentCreator):
self._oprot.writeMessageBegin('AddTrackToAlbum', TMessageType.CALL, self._seqid)
args = AddTrackToAlbum_args()
args.idAlbum = idAlbum
args.newTrack = newTrack
args.idContentCreator = idContentCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddTrackToAlbum(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddTrackToAlbum_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddTrackToAlbum failed: unknown result")
def AddFeaturingTrack(self, idNewTrack, idContenCreator):
"""
Register a featuring Track
@param idNewTrack
The Track Id which will be registered as featuring
@return idNewTrack
The Id of the featuring Track registered
Parameters:
- idNewTrack
- idContenCreator
"""
self.send_AddFeaturingTrack(idNewTrack, idContenCreator)
return self.recv_AddFeaturingTrack()
def send_AddFeaturingTrack(self, idNewTrack, idContenCreator):
self._oprot.writeMessageBegin('AddFeaturingTrack', TMessageType.CALL, self._seqid)
args = AddFeaturingTrack_args()
args.idNewTrack = idNewTrack
args.idContenCreator = idContenCreator
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddFeaturingTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddFeaturingTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddFeaturingTrack failed: unknown result")
def DeleteAlbumTrack(self, idAlbum, trackNumber):
"""
Delete a Track from an Album
@param idAlbum
The Album Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted.
@return Id
The Track Id of the deleted Track.
Parameters:
- idAlbum
- trackNumber
"""
self.send_DeleteAlbumTrack(idAlbum, trackNumber)
return self.recv_DeleteAlbumTrack()
def send_DeleteAlbumTrack(self, idAlbum, trackNumber):
self._oprot.writeMessageBegin('DeleteAlbumTrack', TMessageType.CALL, self._seqid)
args = DeleteAlbumTrack_args()
args.idAlbum = idAlbum
args.trackNumber = trackNumber
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeleteAlbumTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeleteAlbumTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeleteAlbumTrack failed: unknown result")
def GetTrackByQuery(self, query):
"""
Get Track by Query
@param query
The query used to search for Tracks
@return Track
list<Track>
Parameters:
- query
"""
self.send_GetTrackByQuery(query)
return self.recv_GetTrackByQuery()
def send_GetTrackByQuery(self, query):
self._oprot.writeMessageBegin('GetTrackByQuery', TMessageType.CALL, self._seqid)
args = GetTrackByQuery_args()
args.query = query
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetTrackByQuery(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetTrackByQuery_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetTrackByQuery failed: unknown result")
def UpdateAlbumTrackTitle(self, idAlbum, trackNumber, newAlbumTrackTitle):
"""
Update a previously registered Album track title.
@param idAlbum
The Album Id of the Album whose track title requires an update.
@param trackNumber
The Track number of the Track whose title requires an update.
@return Album
The modified Album object.
Parameters:
- idAlbum
- trackNumber
- newAlbumTrackTitle
"""
self.send_UpdateAlbumTrackTitle(idAlbum, trackNumber, newAlbumTrackTitle)
return self.recv_UpdateAlbumTrackTitle()
def send_UpdateAlbumTrackTitle(self, idAlbum, trackNumber, newAlbumTrackTitle):
self._oprot.writeMessageBegin('UpdateAlbumTrackTitle', TMessageType.CALL, self._seqid)
args = UpdateAlbumTrackTitle_args()
args.idAlbum = idAlbum
args.trackNumber = trackNumber
args.newAlbumTrackTitle = newAlbumTrackTitle
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateAlbumTrackTitle(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateAlbumTrackTitle_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateAlbumTrackTitle failed: unknown result")
def UpdateAlbumTrackFeaturing(self, idAlbum, trackNumber, newFeaturing):
"""
Update a previously registered Album track featuring.
@param idAlbum
The Album Id of the Album whose track featuring requires an update.
@param trackNumber
The Track number of the Track whose featuring requires an update.
@return Album
The modified Album object.
Parameters:
- idAlbum
- trackNumber
- newFeaturing
"""
self.send_UpdateAlbumTrackFeaturing(idAlbum, trackNumber, newFeaturing)
return self.recv_UpdateAlbumTrackFeaturing()
def send_UpdateAlbumTrackFeaturing(self, idAlbum, trackNumber, newFeaturing):
self._oprot.writeMessageBegin('UpdateAlbumTrackFeaturing', TMessageType.CALL, self._seqid)
args = UpdateAlbumTrackFeaturing_args()
args.idAlbum = idAlbum
args.trackNumber = trackNumber
args.newFeaturing = newFeaturing
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_UpdateAlbumTrackFeaturing(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = UpdateAlbumTrackFeaturing_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
if result.sErrorInvalidRequestE is not None:
raise result.sErrorInvalidRequestE
raise TApplicationException(TApplicationException.MISSING_RESULT, "UpdateAlbumTrackFeaturing failed: unknown result")
def AddTrackToLibrary(self, idLibrary, idTrack):
"""
Add a Track to Library.
@param idLibrary
The Library Id to which a track will be added
@param idTrack
The Id of the Track which will be added
@return Track
Track object added
Parameters:
- idLibrary
- idTrack
"""
self.send_AddTrackToLibrary(idLibrary, idTrack)
return self.recv_AddTrackToLibrary()
def send_AddTrackToLibrary(self, idLibrary, idTrack):
self._oprot.writeMessageBegin('AddTrackToLibrary', TMessageType.CALL, self._seqid)
args = AddTrackToLibrary_args()
args.idLibrary = idLibrary
args.idTrack = idTrack
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddTrackToLibrary(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddTrackToLibrary_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddTrackToLibrary failed: unknown result")
def DeleteLibraryTrack(self, idLibrary, trackNumber):
"""
Delete a Track from a Library
@param idLibrary
The Library Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idLibrary
- trackNumber
"""
self.send_DeleteLibraryTrack(idLibrary, trackNumber)
return self.recv_DeleteLibraryTrack()
def send_DeleteLibraryTrack(self, idLibrary, trackNumber):
self._oprot.writeMessageBegin('DeleteLibraryTrack', TMessageType.CALL, self._seqid)
args = DeleteLibraryTrack_args()
args.idLibrary = idLibrary
args.trackNumber = trackNumber
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeleteLibraryTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeleteLibraryTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeleteLibraryTrack failed: unknown result")
def AddTrackToPlaylist(self, idPlaylist, idTrack):
"""
Add a Track to Playlist.
@param idPlaylist
The Playlist Id to which a track will be added
@param idTrack
The Id of the Track which will be added
@return Track
Track object added
Parameters:
- idPlaylist
- idTrack
"""
self.send_AddTrackToPlaylist(idPlaylist, idTrack)
return self.recv_AddTrackToPlaylist()
def send_AddTrackToPlaylist(self, idPlaylist, idTrack):
self._oprot.writeMessageBegin('AddTrackToPlaylist', TMessageType.CALL, self._seqid)
args = AddTrackToPlaylist_args()
args.idPlaylist = idPlaylist
args.idTrack = idTrack
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddTrackToPlaylist(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddTrackToPlaylist_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddTrackToPlaylist failed: unknown result")
def DeletePlaylistTrack(self, idPlaylist, trackNumber):
"""
Delete a Track from a Playlist
@param idPlaylist
The Playlist Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idPlaylist
- trackNumber
"""
self.send_DeletePlaylistTrack(idPlaylist, trackNumber)
return self.recv_DeletePlaylistTrack()
def send_DeletePlaylistTrack(self, idPlaylist, trackNumber):
self._oprot.writeMessageBegin('DeletePlaylistTrack', TMessageType.CALL, self._seqid)
args = DeletePlaylistTrack_args()
args.idPlaylist = idPlaylist
args.trackNumber = trackNumber
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeletePlaylistTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeletePlaylistTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeletePlaylistTrack failed: unknown result")
def AddTrackToPlayQueue(self, idPlayQueu, newTrack):
"""
Add a Track to PlayQueue.
@param idPlayQueu
The PlayQueue Id to which a track will be added
@param newTrack
@return Track
Track object added
Parameters:
- idPlayQueu
- newTrack
"""
self.send_AddTrackToPlayQueue(idPlayQueu, newTrack)
return self.recv_AddTrackToPlayQueue()
def send_AddTrackToPlayQueue(self, idPlayQueu, newTrack):
self._oprot.writeMessageBegin('AddTrackToPlayQueue', TMessageType.CALL, self._seqid)
args = AddTrackToPlayQueue_args()
args.idPlayQueu = idPlayQueu
args.newTrack = newTrack
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddTrackToPlayQueue(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddTrackToPlayQueue_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddTrackToPlayQueue failed: unknown result")
def DeletePlayQueueTrack(self, idPlayQueu, trackNumber):
"""
Delete a Track from a PlayQueue
@param idPlayQueu
The PlayQueue Id from which a track will be deleted.
@param trackNumber
The Track number that will be deleted
@return Id
The Track Id of the Track deleted.
Parameters:
- idPlayQueu
- trackNumber
"""
self.send_DeletePlayQueueTrack(idPlayQueu, trackNumber)
return self.recv_DeletePlayQueueTrack()
def send_DeletePlayQueueTrack(self, idPlayQueu, trackNumber):
self._oprot.writeMessageBegin('DeletePlayQueueTrack', TMessageType.CALL, self._seqid)
args = DeletePlayQueueTrack_args()
args.idPlayQueu = idPlayQueu
args.trackNumber = trackNumber
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_DeletePlayQueueTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = DeletePlayQueueTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorNotFoundE is not None:
raise result.sErrorNotFoundE
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "DeletePlayQueueTrack failed: unknown result")
def GenerateRadioStation(self, idGender):
"""
Generate a Radio Station
@param idGender
The genre Id for which the radio station will be generated.
@return tracks
List of tracks that belong to the genre entered.
Parameters:
- idGender
"""
self.send_GenerateRadioStation(idGender)
return self.recv_GenerateRadioStation()
def send_GenerateRadioStation(self, idGender):
self._oprot.writeMessageBegin('GenerateRadioStation', TMessageType.CALL, self._seqid)
args = GenerateRadioStation_args()
args.idGender = idGender
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GenerateRadioStation(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GenerateRadioStation_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GenerateRadioStation failed: unknown result")
def GetLocalTracksByIdConsumer(self, idConsumer):
"""
Get Local Tracks By Id Consumer.
@param idConsumer
The Consumer Id whose Tracks will be obtained
@return LocalTracks
List of tracks which belong to idConsumer
Parameters:
- idConsumer
"""
self.send_GetLocalTracksByIdConsumer(idConsumer)
return self.recv_GetLocalTracksByIdConsumer()
def send_GetLocalTracksByIdConsumer(self, idConsumer):
self._oprot.writeMessageBegin('GetLocalTracksByIdConsumer', TMessageType.CALL, self._seqid)
args = GetLocalTracksByIdConsumer_args()
args.idConsumer = idConsumer
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_GetLocalTracksByIdConsumer(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = GetLocalTracksByIdConsumer_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "GetLocalTracksByIdConsumer failed: unknown result")
def AddLocalTrack(self, LocalTrack):
"""
Add Local Track.
@param LocalTrack
The Local Track which will be added
@return LocalTrack
The Local Track object added.
Parameters:
- LocalTrack
"""
self.send_AddLocalTrack(LocalTrack)
return self.recv_AddLocalTrack()
def send_AddLocalTrack(self, LocalTrack):
self._oprot.writeMessageBegin('AddLocalTrack', TMessageType.CALL, self._seqid)
args = AddLocalTrack_args()
args.LocalTrack = LocalTrack
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_AddLocalTrack(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = AddLocalTrack_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.sErrorSystemE is not None:
raise result.sErrorSystemE
raise TApplicationException(TApplicationException.MISSING_RESULT, "AddLocalTrack failed: unknown result")
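The Processor that follows routes each incoming message by name through a string-keyed map of unbound methods, falling back to an UNKNOWN_METHOD exception when no entry matches. A standalone sketch of that dispatch pattern (FakeHandler and MiniProcessor are illustrative stand-ins, not part of the generated module):

```python
class FakeHandler:
    # Illustrative handler; a real handler implements the full Iface.
    def GetTrackByTitle(self, title):
        return {"title": title}

class MiniProcessor:
    def __init__(self, handler):
        self._handler = handler
        # Unbound methods keyed by wire-level message name, as in Processor.__init__.
        self._process_map = {"GetTrackByTitle": MiniProcessor.process_GetTrackByTitle}
    def process(self, name, payload):
        if name not in self._process_map:
            # Mirrors the UNKNOWN_METHOD branch of Processor.process.
            raise KeyError("Unknown function %s" % name)
        return self._process_map[name](self, payload)
    def process_GetTrackByTitle(self, payload):
        # Delegate to the user-supplied handler, as the generated process_* methods do.
        return self._handler.GetTrackByTitle(payload)

proc = MiniProcessor(FakeHandler())
print(proc.process("GetTrackByTitle", "Imagine"))  # {'title': 'Imagine'}
```

Storing unbound methods and passing `self` explicitly lets one shared map serve every Processor instance; the generated process_* methods additionally deserialize the args struct and serialize the result or exception back onto the wire.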
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["GetTrackByTitle"] = Processor.process_GetTrackByTitle
self._processMap["GetTrackByAlbumId"] = Processor.process_GetTrackByAlbumId
self._processMap["GetTrackByPlaylistId"] = Processor.process_GetTrackByPlaylistId
self._processMap["GetTrackByLibraryId"] = Processor.process_GetTrackByLibraryId
self._processMap["AddTrackToAlbum"] = Processor.process_AddTrackToAlbum
self._processMap["AddFeaturingTrack"] = Processor.process_AddFeaturingTrack
self._processMap["DeleteAlbumTrack"] = Processor.process_DeleteAlbumTrack
self._processMap["GetTrackByQuery"] = Processor.process_GetTrackByQuery
self._processMap["UpdateAlbumTrackTitle"] = Processor.process_UpdateAlbumTrackTitle
self._processMap["UpdateAlbumTrackFeaturing"] = Processor.process_UpdateAlbumTrackFeaturing
self._processMap["AddTrackToLibrary"] = Processor.process_AddTrackToLibrary
self._processMap["DeleteLibraryTrack"] = Processor.process_DeleteLibraryTrack
self._processMap["AddTrackToPlaylist"] = Processor.process_AddTrackToPlaylist
self._processMap["DeletePlaylistTrack"] = Processor.process_DeletePlaylistTrack
self._processMap["AddTrackToPlayQueue"] = Processor.process_AddTrackToPlayQueue
self._processMap["DeletePlayQueueTrack"] = Processor.process_DeletePlayQueueTrack
self._processMap["GenerateRadioStation"] = Processor.process_GenerateRadioStation
self._processMap["GetLocalTracksByIdConsumer"] = Processor.process_GetLocalTracksByIdConsumer
self._processMap["AddLocalTrack"] = Processor.process_AddLocalTrack
self._on_message_begin = None
def on_message_begin(self, func):
self._on_message_begin = func
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if self._on_message_begin:
self._on_message_begin(name, type, seqid)
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_GetTrackByTitle(self, seqid, iprot, oprot):
args = GetTrackByTitle_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetTrackByTitle_result()
try:
result.success = self._handler.GetTrackByTitle(args.title)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetTrackByTitle", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetTrackByAlbumId(self, seqid, iprot, oprot):
args = GetTrackByAlbumId_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetTrackByAlbumId_result()
try:
result.success = self._handler.GetTrackByAlbumId(args.idAlbum)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetTrackByAlbumId", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_GetTrackByPlaylistId(self, seqid, iprot, oprot):
args = GetTrackByPlaylistId_args()
args.read(iprot)
iprot.readMessageEnd()
result = GetTrackByPlaylistId_result()
try:
result.success = self._handler.GetTrackByPlaylistId(args.idPlaylist)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
msg_type = TMessageType.REPLY
result.sErrorNotFoundE = sErrorNotFoundE
except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
msg_type = TMessageType.REPLY
result.sErrorSystemE = sErrorSystemE
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("GetTrackByPlaylistId", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
    def process_GetTrackByLibraryId(self, seqid, iprot, oprot):
        args = GetTrackByLibraryId_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = GetTrackByLibraryId_result()
        try:
            result.success = self._handler.GetTrackByLibraryId(args.idLibrary)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("GetTrackByLibraryId", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_AddTrackToAlbum(self, seqid, iprot, oprot):
        args = AddTrackToAlbum_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddTrackToAlbum_result()
        try:
            result.success = self._handler.AddTrackToAlbum(args.idAlbum, args.newTrack, args.idContentCreator)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddTrackToAlbum", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_AddFeaturingTrack(self, seqid, iprot, oprot):
        args = AddFeaturingTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddFeaturingTrack_result()
        try:
            result.success = self._handler.AddFeaturingTrack(args.idNewTrack, args.idContenCreator)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddFeaturingTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_DeleteAlbumTrack(self, seqid, iprot, oprot):
        args = DeleteAlbumTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = DeleteAlbumTrack_result()
        try:
            result.success = self._handler.DeleteAlbumTrack(args.idAlbum, args.trackNumber)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
            msg_type = TMessageType.REPLY
            result.sErrorInvalidRequestE = sErrorInvalidRequestE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("DeleteAlbumTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_GetTrackByQuery(self, seqid, iprot, oprot):
        args = GetTrackByQuery_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = GetTrackByQuery_result()
        try:
            result.success = self._handler.GetTrackByQuery(args.query)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("GetTrackByQuery", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_UpdateAlbumTrackTitle(self, seqid, iprot, oprot):
        args = UpdateAlbumTrackTitle_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = UpdateAlbumTrackTitle_result()
        try:
            result.success = self._handler.UpdateAlbumTrackTitle(args.idAlbum, args.trackNumber, args.newAlbumTrackTitle)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
            msg_type = TMessageType.REPLY
            result.sErrorInvalidRequestE = sErrorInvalidRequestE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("UpdateAlbumTrackTitle", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_UpdateAlbumTrackFeaturing(self, seqid, iprot, oprot):
        args = UpdateAlbumTrackFeaturing_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = UpdateAlbumTrackFeaturing_result()
        try:
            result.success = self._handler.UpdateAlbumTrackFeaturing(args.idAlbum, args.trackNumber, args.newFeaturing)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except SpotifakeManagement.ttypes.SErrorInvalidRequestException as sErrorInvalidRequestE:
            msg_type = TMessageType.REPLY
            result.sErrorInvalidRequestE = sErrorInvalidRequestE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("UpdateAlbumTrackFeaturing", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_AddTrackToLibrary(self, seqid, iprot, oprot):
        args = AddTrackToLibrary_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddTrackToLibrary_result()
        try:
            result.success = self._handler.AddTrackToLibrary(args.idLibrary, args.idTrack)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddTrackToLibrary", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_DeleteLibraryTrack(self, seqid, iprot, oprot):
        args = DeleteLibraryTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = DeleteLibraryTrack_result()
        try:
            result.success = self._handler.DeleteLibraryTrack(args.idLibrary, args.trackNumber)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("DeleteLibraryTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_AddTrackToPlaylist(self, seqid, iprot, oprot):
        args = AddTrackToPlaylist_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddTrackToPlaylist_result()
        try:
            result.success = self._handler.AddTrackToPlaylist(args.idPlaylist, args.idTrack)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddTrackToPlaylist", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_DeletePlaylistTrack(self, seqid, iprot, oprot):
        args = DeletePlaylistTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = DeletePlaylistTrack_result()
        try:
            result.success = self._handler.DeletePlaylistTrack(args.idPlaylist, args.trackNumber)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("DeletePlaylistTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_AddTrackToPlayQueue(self, seqid, iprot, oprot):
        args = AddTrackToPlayQueue_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddTrackToPlayQueue_result()
        try:
            result.success = self._handler.AddTrackToPlayQueue(args.idPlayQueu, args.newTrack)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddTrackToPlayQueue", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
    def process_DeletePlayQueueTrack(self, seqid, iprot, oprot):
        args = DeletePlayQueueTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = DeletePlayQueueTrack_result()
        try:
            result.success = self._handler.DeletePlayQueueTrack(args.idPlayQueu, args.trackNumber)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorNotFoundException as sErrorNotFoundE:
            msg_type = TMessageType.REPLY
            result.sErrorNotFoundE = sErrorNotFoundE
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("DeletePlayQueueTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_GenerateRadioStation(self, seqid, iprot, oprot):
        args = GenerateRadioStation_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = GenerateRadioStation_result()
        try:
            result.success = self._handler.GenerateRadioStation(args.idGender)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("GenerateRadioStation", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_GetLocalTracksByIdConsumer(self, seqid, iprot, oprot):
        args = GetLocalTracksByIdConsumer_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = GetLocalTracksByIdConsumer_result()
        try:
            result.success = self._handler.GetLocalTracksByIdConsumer(args.idConsumer)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("GetLocalTracksByIdConsumer", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()

    def process_AddLocalTrack(self, seqid, iprot, oprot):
        args = AddLocalTrack_args()
        args.read(iprot)
        iprot.readMessageEnd()
        result = AddLocalTrack_result()
        try:
            result.success = self._handler.AddLocalTrack(args.LocalTrack)
            msg_type = TMessageType.REPLY
        except TTransport.TTransportException:
            raise
        except SpotifakeManagement.ttypes.SErrorSystemException as sErrorSystemE:
            msg_type = TMessageType.REPLY
            result.sErrorSystemE = sErrorSystemE
        except TApplicationException as ex:
            logging.exception('TApplication exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = ex
        except Exception:
            logging.exception('Unexpected exception in handler')
            msg_type = TMessageType.EXCEPTION
            result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
        oprot.writeMessageBegin("AddLocalTrack", msg_type, seqid)
        result.write(oprot)
        oprot.writeMessageEnd()
        oprot.trans.flush()
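# The process_* methods above all follow one contract: decode the _args
# struct, call the identically named method on self._handler, and map each
# declared service exception onto a field of the _result struct. The sketch
# below illustrates that handler-side contract with a minimal in-memory
# implementation. It is NOT part of the generated file: the class name
# InMemoryTrackHandlerSketch, its dict-based storage, and the use of plain
# values and KeyError in place of the generated ttypes.Track and
# SErrorNotFoundException are illustrative assumptions only.

```python
class InMemoryTrackHandlerSketch(object):
    """Hypothetical handler sketch; real handlers return ttypes.Track
    objects and raise the generated SError* exceptions instead."""

    def __init__(self):
        # idLibrary -> list of track ids (stand-ins for ttypes.Track)
        self._libraries = {}

    def AddTrackToLibrary(self, idLibrary, idTrack):
        # Mirrors process_AddTrackToLibrary: the return value becomes
        # result.success in the REPLY message.
        self._libraries.setdefault(idLibrary, []).append(idTrack)
        return True

    def GetTrackByLibraryId(self, idLibrary):
        # A real handler would raise SErrorNotFoundException here, which
        # process_GetTrackByLibraryId catches and ships back as a REPLY;
        # KeyError is a placeholder for this sketch.
        if idLibrary not in self._libraries:
            raise KeyError(idLibrary)
        return self._libraries[idLibrary]
```

Any object exposing methods with these names and signatures can be passed as the `handler` when constructing the generated Processor.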
# HELPER FUNCTIONS AND STRUCTURES
class GetTrackByTitle_args(object):
    """
    Attributes:
     - title
    """

    def __init__(self, title=None,):
        self.title = title

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.title = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByTitle_args')
        if self.title is not None:
            oprot.writeFieldBegin('title', TType.STRING, 1)
            oprot.writeString(self.title.encode('utf-8') if sys.version_info[0] == 2 else self.title)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByTitle_args)
GetTrackByTitle_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'title', 'UTF8', None, ),  # 1
)
class GetTrackByTitle_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = SpotifakeManagement.ttypes.Track()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByTitle_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByTitle_result)
GetTrackByTitle_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.Track, None], None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class GetTrackByAlbumId_args(object):
    """
    Attributes:
     - idAlbum
    """

    def __init__(self, idAlbum=None,):
        self.idAlbum = idAlbum

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idAlbum = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByAlbumId_args')
        if self.idAlbum is not None:
            oprot.writeFieldBegin('idAlbum', TType.I16, 1)
            oprot.writeI16(self.idAlbum)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByAlbumId_args)
GetTrackByAlbumId_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idAlbum', None, None, ),  # 1
)
class GetTrackByAlbumId_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype24, _size21) = iprot.readListBegin()
                    for _i25 in range(_size21):
                        _elem26 = SpotifakeManagement.ttypes.Track()
                        _elem26.read(iprot)
                        self.success.append(_elem26)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByAlbumId_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter27 in self.success:
                iter27.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByAlbumId_result)
GetTrackByAlbumId_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.Track, None], False), None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class GetTrackByPlaylistId_args(object):
    """
    Attributes:
     - idPlaylist
    """

    def __init__(self, idPlaylist=None,):
        self.idPlaylist = idPlaylist

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idPlaylist = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByPlaylistId_args')
        if self.idPlaylist is not None:
            oprot.writeFieldBegin('idPlaylist', TType.I16, 1)
            oprot.writeI16(self.idPlaylist)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByPlaylistId_args)
GetTrackByPlaylistId_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idPlaylist', None, None, ),  # 1
)
class GetTrackByPlaylistId_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype31, _size28) = iprot.readListBegin()
                    for _i32 in range(_size28):
                        _elem33 = SpotifakeManagement.ttypes.Track()
                        _elem33.read(iprot)
                        self.success.append(_elem33)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByPlaylistId_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter34 in self.success:
                iter34.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByPlaylistId_result)
GetTrackByPlaylistId_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.Track, None], False), None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class GetTrackByLibraryId_args(object):
    """
    Attributes:
     - idLibrary
    """

    def __init__(self, idLibrary=None,):
        self.idLibrary = idLibrary

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idLibrary = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByLibraryId_args')
        if self.idLibrary is not None:
            oprot.writeFieldBegin('idLibrary', TType.I16, 1)
            oprot.writeI16(self.idLibrary)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByLibraryId_args)
GetTrackByLibraryId_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idLibrary', None, None, ),  # 1
)
class GetTrackByLibraryId_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype38, _size35) = iprot.readListBegin()
                    for _i39 in range(_size35):
                        _elem40 = SpotifakeManagement.ttypes.Track()
                        _elem40.read(iprot)
                        self.success.append(_elem40)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByLibraryId_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter41 in self.success:
                iter41.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(GetTrackByLibraryId_result)
GetTrackByLibraryId_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.Track, None], False), None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class AddTrackToAlbum_args(object):
    """
    Attributes:
     - idAlbum
     - newTrack
     - idContentCreator
    """

    def __init__(self, idAlbum=None, newTrack=None, idContentCreator=None,):
        self.idAlbum = idAlbum
        self.newTrack = newTrack
        self.idContentCreator = idContentCreator

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idAlbum = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.newTrack = SpotifakeManagement.ttypes.Track()
                    self.newTrack.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.I16:
                    self.idContentCreator = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddTrackToAlbum_args')
        if self.idAlbum is not None:
            oprot.writeFieldBegin('idAlbum', TType.I16, 1)
            oprot.writeI16(self.idAlbum)
            oprot.writeFieldEnd()
        if self.newTrack is not None:
            oprot.writeFieldBegin('newTrack', TType.STRUCT, 2)
            self.newTrack.write(oprot)
            oprot.writeFieldEnd()
        if self.idContentCreator is not None:
            oprot.writeFieldBegin('idContentCreator', TType.I16, 3)
            oprot.writeI16(self.idContentCreator)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddTrackToAlbum_args)
AddTrackToAlbum_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idAlbum', None, None, ),  # 1
    (2, TType.STRUCT, 'newTrack', [SpotifakeManagement.ttypes.Track, None], None, ),  # 2
    (3, TType.I16, 'idContentCreator', None, None, ),  # 3
)
class AddTrackToAlbum_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I16:
                    self.success = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddTrackToAlbum_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I16, 0)
            oprot.writeI16(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddTrackToAlbum_result)
AddTrackToAlbum_result.thrift_spec = (
    (0, TType.I16, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class AddFeaturingTrack_args(object):
    """
    Attributes:
     - idNewTrack
     - idContenCreator
    """

    def __init__(self, idNewTrack=None, idContenCreator=None,):
        self.idNewTrack = idNewTrack
        self.idContenCreator = idContenCreator

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idNewTrack = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.idContenCreator = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddFeaturingTrack_args')
        if self.idNewTrack is not None:
            oprot.writeFieldBegin('idNewTrack', TType.I16, 1)
            oprot.writeI16(self.idNewTrack)
            oprot.writeFieldEnd()
        if self.idContenCreator is not None:
            oprot.writeFieldBegin('idContenCreator', TType.I16, 2)
            oprot.writeI16(self.idContenCreator)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddFeaturingTrack_args)
AddFeaturingTrack_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idNewTrack', None, None, ),  # 1
    (2, TType.I16, 'idContenCreator', None, None, ),  # 2
)
class AddFeaturingTrack_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I16:
                    self.success = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddFeaturingTrack_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I16, 0)
            oprot.writeI16(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddFeaturingTrack_result)
AddFeaturingTrack_result.thrift_spec = (
    (0, TType.I16, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class DeleteAlbumTrack_args(object):
    """
    Attributes:
     - idAlbum
     - trackNumber
    """

    def __init__(self, idAlbum=None, trackNumber=None,):
        self.idAlbum = idAlbum
        self.trackNumber = trackNumber

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idAlbum = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.trackNumber = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteAlbumTrack_args')
        if self.idAlbum is not None:
            oprot.writeFieldBegin('idAlbum', TType.I16, 1)
            oprot.writeI16(self.idAlbum)
            oprot.writeFieldEnd()
        if self.trackNumber is not None:
            oprot.writeFieldBegin('trackNumber', TType.I16, 2)
            oprot.writeI16(self.trackNumber)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(DeleteAlbumTrack_args)
DeleteAlbumTrack_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idAlbum', None, None, ),  # 1
    (2, TType.I16, 'trackNumber', None, None, ),  # 2
)
class DeleteAlbumTrack_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
     - sErrorInvalidRequestE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE
        self.sErrorInvalidRequestE = sErrorInvalidRequestE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.I16:
                    self.success = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRUCT:
                    self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
                    self.sErrorInvalidRequestE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('DeleteAlbumTrack_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.I16, 0)
            oprot.writeI16(self.success)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorInvalidRequestE is not None:
            oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
            self.sErrorInvalidRequestE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(DeleteAlbumTrack_result)
DeleteAlbumTrack_result.thrift_spec = (
    (0, TType.I16, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
    (3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ),  # 3
)
class GetTrackByQuery_args(object):
    """
    Attributes:
     - query
    """

    def __init__(self, query=None,):
        self.query = query

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.STRING:
                    self.query = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByQuery_args')
        if self.query is not None:
            oprot.writeFieldBegin('query', TType.STRING, 1)
            oprot.writeString(self.query.encode('utf-8') if sys.version_info[0] == 2 else self.query)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(GetTrackByQuery_args)
GetTrackByQuery_args.thrift_spec = (
    None,  # 0
    (1, TType.STRING, 'query', 'UTF8', None, ),  # 1
)
class GetTrackByQuery_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.LIST:
                    self.success = []
                    (_etype45, _size42) = iprot.readListBegin()
                    for _i46 in range(_size42):
                        _elem47 = SpotifakeManagement.ttypes.Track()
                        _elem47.read(iprot)
                        self.success.append(_elem47)
                    iprot.readListEnd()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('GetTrackByQuery_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.LIST, 0)
            oprot.writeListBegin(TType.STRUCT, len(self.success))
            for iter48 in self.success:
                iter48.write(oprot)
            oprot.writeListEnd()
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(GetTrackByQuery_result)
GetTrackByQuery_result.thrift_spec = (
    (0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.Track, None], False), None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
)
class UpdateAlbumTrackTitle_args(object):
    """
    Attributes:
     - idAlbum
     - trackNumber
     - newAlbumTrackTitle
    """

    def __init__(self, idAlbum=None, trackNumber=None, newAlbumTrackTitle=None,):
        self.idAlbum = idAlbum
        self.trackNumber = trackNumber
        self.newAlbumTrackTitle = newAlbumTrackTitle

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idAlbum = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.trackNumber = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRING:
                    self.newAlbumTrackTitle = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('UpdateAlbumTrackTitle_args')
        if self.idAlbum is not None:
            oprot.writeFieldBegin('idAlbum', TType.I16, 1)
            oprot.writeI16(self.idAlbum)
            oprot.writeFieldEnd()
        if self.trackNumber is not None:
            oprot.writeFieldBegin('trackNumber', TType.I16, 2)
            oprot.writeI16(self.trackNumber)
            oprot.writeFieldEnd()
        if self.newAlbumTrackTitle is not None:
            oprot.writeFieldBegin('newAlbumTrackTitle', TType.STRING, 3)
            oprot.writeString(self.newAlbumTrackTitle.encode('utf-8') if sys.version_info[0] == 2 else self.newAlbumTrackTitle)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(UpdateAlbumTrackTitle_args)
UpdateAlbumTrackTitle_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idAlbum', None, None, ),  # 1
    (2, TType.I16, 'trackNumber', None, None, ),  # 2
    (3, TType.STRING, 'newAlbumTrackTitle', 'UTF8', None, ),  # 3
)
class UpdateAlbumTrackTitle_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
     - sErrorInvalidRequestE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE
        self.sErrorInvalidRequestE = sErrorInvalidRequestE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = SpotifakeManagement.ttypes.Track()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRUCT:
                    self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
                    self.sErrorInvalidRequestE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('UpdateAlbumTrackTitle_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorInvalidRequestE is not None:
            oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
            self.sErrorInvalidRequestE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(UpdateAlbumTrackTitle_result)
UpdateAlbumTrackTitle_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.Track, None], None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
    (3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ),  # 3
)
class UpdateAlbumTrackFeaturing_args(object):
    """
    Attributes:
     - idAlbum
     - trackNumber
     - newFeaturing
    """

    def __init__(self, idAlbum=None, trackNumber=None, newFeaturing=None,):
        self.idAlbum = idAlbum
        self.trackNumber = trackNumber
        self.newFeaturing = newFeaturing

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idAlbum = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.trackNumber = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRUCT:
                    self.newFeaturing = SpotifakeManagement.ttypes.ContentCreator()
                    self.newFeaturing.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('UpdateAlbumTrackFeaturing_args')
        if self.idAlbum is not None:
            oprot.writeFieldBegin('idAlbum', TType.I16, 1)
            oprot.writeI16(self.idAlbum)
            oprot.writeFieldEnd()
        if self.trackNumber is not None:
            oprot.writeFieldBegin('trackNumber', TType.I16, 2)
            oprot.writeI16(self.trackNumber)
            oprot.writeFieldEnd()
        if self.newFeaturing is not None:
            oprot.writeFieldBegin('newFeaturing', TType.STRUCT, 3)
            self.newFeaturing.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(UpdateAlbumTrackFeaturing_args)
UpdateAlbumTrackFeaturing_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idAlbum', None, None, ),  # 1
    (2, TType.I16, 'trackNumber', None, None, ),  # 2
    (3, TType.STRUCT, 'newFeaturing', [SpotifakeManagement.ttypes.ContentCreator, None], None, ),  # 3
)
class UpdateAlbumTrackFeaturing_result(object):
    """
    Attributes:
     - success
     - sErrorNotFoundE
     - sErrorSystemE
     - sErrorInvalidRequestE
    """

    def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None, sErrorInvalidRequestE=None,):
        self.success = success
        self.sErrorNotFoundE = sErrorNotFoundE
        self.sErrorSystemE = sErrorSystemE
        self.sErrorInvalidRequestE = sErrorInvalidRequestE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.STRUCT:
                    self.success = SpotifakeManagement.ttypes.Track()
                    self.success.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
                    self.sErrorNotFoundE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            elif fid == 3:
                if ftype == TType.STRUCT:
                    self.sErrorInvalidRequestE = SpotifakeManagement.ttypes.SErrorInvalidRequestException()
                    self.sErrorInvalidRequestE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('UpdateAlbumTrackFeaturing_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.STRUCT, 0)
            self.success.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorNotFoundE is not None:
            oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
            self.sErrorNotFoundE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        if self.sErrorInvalidRequestE is not None:
            oprot.writeFieldBegin('sErrorInvalidRequestE', TType.STRUCT, 3)
            self.sErrorInvalidRequestE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(UpdateAlbumTrackFeaturing_result)
UpdateAlbumTrackFeaturing_result.thrift_spec = (
    (0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.Track, None], None, ),  # 0
    (1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ),  # 1
    (2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 2
    (3, TType.STRUCT, 'sErrorInvalidRequestE', [SpotifakeManagement.ttypes.SErrorInvalidRequestException, None], None, ),  # 3
)
class AddTrackToLibrary_args(object):
    """
    Attributes:
     - idLibrary
     - idTrack
    """

    def __init__(self, idLibrary=None, idTrack=None,):
        self.idLibrary = idLibrary
        self.idTrack = idTrack

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 1:
                if ftype == TType.I16:
                    self.idLibrary = iprot.readI16()
                else:
                    iprot.skip(ftype)
            elif fid == 2:
                if ftype == TType.I16:
                    self.idTrack = iprot.readI16()
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddTrackToLibrary_args')
        if self.idLibrary is not None:
            oprot.writeFieldBegin('idLibrary', TType.I16, 1)
            oprot.writeI16(self.idLibrary)
            oprot.writeFieldEnd()
        if self.idTrack is not None:
            oprot.writeFieldBegin('idTrack', TType.I16, 2)
            oprot.writeI16(self.idTrack)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddTrackToLibrary_args)
AddTrackToLibrary_args.thrift_spec = (
    None,  # 0
    (1, TType.I16, 'idLibrary', None, None, ),  # 1
    (2, TType.I16, 'idTrack', None, None, ),  # 2
)
class AddTrackToLibrary_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.BOOL:
                    self.success = iprot.readBool()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddTrackToLibrary_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.BOOL, 0)
            oprot.writeBool(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)

all_structs.append(AddTrackToLibrary_result)
AddTrackToLibrary_result.thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
class DeleteLibraryTrack_args(object):
"""
Attributes:
- idLibrary
- trackNumber
"""
def __init__(self, idLibrary=None, trackNumber=None,):
self.idLibrary = idLibrary
self.trackNumber = trackNumber
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idLibrary = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.trackNumber = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeleteLibraryTrack_args')
if self.idLibrary is not None:
oprot.writeFieldBegin('idLibrary', TType.I16, 1)
oprot.writeI16(self.idLibrary)
oprot.writeFieldEnd()
if self.trackNumber is not None:
oprot.writeFieldBegin('trackNumber', TType.I16, 2)
oprot.writeI16(self.trackNumber)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeleteLibraryTrack_args)
DeleteLibraryTrack_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idLibrary', None, None, ), # 1
(2, TType.I16, 'trackNumber', None, None, ), # 2
)
class DeleteLibraryTrack_result(object):
"""
Attributes:
- success
- sErrorNotFoundE
- sErrorSystemE
"""
def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
self.success = success
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I16:
self.success = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeleteLibraryTrack_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I16, 0)
oprot.writeI16(self.success)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeleteLibraryTrack_result)
DeleteLibraryTrack_result.thrift_spec = (
(0, TType.I16, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 2
)
class AddTrackToPlaylist_args(object):
"""
Attributes:
- idPlaylist
- idTrack
"""
def __init__(self, idPlaylist=None, idTrack=None,):
self.idPlaylist = idPlaylist
self.idTrack = idTrack
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idPlaylist = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.idTrack = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddTrackToPlaylist_args')
if self.idPlaylist is not None:
oprot.writeFieldBegin('idPlaylist', TType.I16, 1)
oprot.writeI16(self.idPlaylist)
oprot.writeFieldEnd()
if self.idTrack is not None:
oprot.writeFieldBegin('idTrack', TType.I16, 2)
oprot.writeI16(self.idTrack)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddTrackToPlaylist_args)
AddTrackToPlaylist_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idPlaylist', None, None, ), # 1
(2, TType.I16, 'idTrack', None, None, ), # 2
)
class AddTrackToPlaylist_result(object):
"""
Attributes:
- success
- sErrorSystemE
"""
def __init__(self, success=None, sErrorSystemE=None,):
self.success = success
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.BOOL:
self.success = iprot.readBool()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddTrackToPlaylist_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.BOOL, 0)
oprot.writeBool(self.success)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddTrackToPlaylist_result)
AddTrackToPlaylist_result.thrift_spec = (
(0, TType.BOOL, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 1
)
class DeletePlaylistTrack_args(object):
"""
Attributes:
- idPlaylist
- trackNumber
"""
def __init__(self, idPlaylist=None, trackNumber=None,):
self.idPlaylist = idPlaylist
self.trackNumber = trackNumber
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idPlaylist = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.trackNumber = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeletePlaylistTrack_args')
if self.idPlaylist is not None:
oprot.writeFieldBegin('idPlaylist', TType.I16, 1)
oprot.writeI16(self.idPlaylist)
oprot.writeFieldEnd()
if self.trackNumber is not None:
oprot.writeFieldBegin('trackNumber', TType.I16, 2)
oprot.writeI16(self.trackNumber)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeletePlaylistTrack_args)
DeletePlaylistTrack_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idPlaylist', None, None, ), # 1
(2, TType.I16, 'trackNumber', None, None, ), # 2
)
class DeletePlaylistTrack_result(object):
"""
Attributes:
- success
- sErrorNotFoundE
- sErrorSystemE
"""
def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
self.success = success
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I16:
self.success = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeletePlaylistTrack_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I16, 0)
oprot.writeI16(self.success)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeletePlaylistTrack_result)
DeletePlaylistTrack_result.thrift_spec = (
(0, TType.I16, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 2
)
class AddTrackToPlayQueue_args(object):
"""
Attributes:
- idPlayQueu
- newTrack
"""
def __init__(self, idPlayQueu=None, newTrack=None,):
self.idPlayQueu = idPlayQueu
self.newTrack = newTrack
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idPlayQueu = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.newTrack = SpotifakeManagement.ttypes.Track()
self.newTrack.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddTrackToPlayQueue_args')
if self.idPlayQueu is not None:
oprot.writeFieldBegin('idPlayQueu', TType.I16, 1)
oprot.writeI16(self.idPlayQueu)
oprot.writeFieldEnd()
if self.newTrack is not None:
oprot.writeFieldBegin('newTrack', TType.STRUCT, 2)
self.newTrack.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddTrackToPlayQueue_args)
AddTrackToPlayQueue_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idPlayQueu', None, None, ), # 1
(2, TType.STRUCT, 'newTrack', [SpotifakeManagement.ttypes.Track, None], None, ), # 2
)
class AddTrackToPlayQueue_result(object):
"""
Attributes:
- success
- sErrorSystemE
"""
def __init__(self, success=None, sErrorSystemE=None,):
self.success = success
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = SpotifakeManagement.ttypes.Track()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddTrackToPlayQueue_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(AddTrackToPlayQueue_result)
AddTrackToPlayQueue_result.thrift_spec = (
(0, TType.STRUCT, 'success', [SpotifakeManagement.ttypes.Track, None], None, ), # 0
(1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 1
)
class DeletePlayQueueTrack_args(object):
"""
Attributes:
- idPlayQueu
- trackNumber
"""
def __init__(self, idPlayQueu=None, trackNumber=None,):
self.idPlayQueu = idPlayQueu
self.trackNumber = trackNumber
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idPlayQueu = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I16:
self.trackNumber = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeletePlayQueueTrack_args')
if self.idPlayQueu is not None:
oprot.writeFieldBegin('idPlayQueu', TType.I16, 1)
oprot.writeI16(self.idPlayQueu)
oprot.writeFieldEnd()
if self.trackNumber is not None:
oprot.writeFieldBegin('trackNumber', TType.I16, 2)
oprot.writeI16(self.trackNumber)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeletePlayQueueTrack_args)
DeletePlayQueueTrack_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idPlayQueu', None, None, ), # 1
(2, TType.I16, 'trackNumber', None, None, ), # 2
)
class DeletePlayQueueTrack_result(object):
"""
Attributes:
- success
- sErrorNotFoundE
- sErrorSystemE
"""
def __init__(self, success=None, sErrorNotFoundE=None, sErrorSystemE=None,):
self.success = success
self.sErrorNotFoundE = sErrorNotFoundE
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I16:
self.success = iprot.readI16()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorNotFoundE = SpotifakeManagement.ttypes.SErrorNotFoundException()
self.sErrorNotFoundE.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('DeletePlayQueueTrack_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I16, 0)
oprot.writeI16(self.success)
oprot.writeFieldEnd()
if self.sErrorNotFoundE is not None:
oprot.writeFieldBegin('sErrorNotFoundE', TType.STRUCT, 1)
self.sErrorNotFoundE.write(oprot)
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 2)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(DeletePlayQueueTrack_result)
DeletePlayQueueTrack_result.thrift_spec = (
(0, TType.I16, 'success', None, None, ), # 0
(1, TType.STRUCT, 'sErrorNotFoundE', [SpotifakeManagement.ttypes.SErrorNotFoundException, None], None, ), # 1
(2, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 2
)
class GenerateRadioStation_args(object):
"""
Attributes:
- idGender
"""
def __init__(self, idGender=None,):
self.idGender = idGender
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idGender = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GenerateRadioStation_args')
if self.idGender is not None:
oprot.writeFieldBegin('idGender', TType.I16, 1)
oprot.writeI16(self.idGender)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GenerateRadioStation_args)
GenerateRadioStation_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idGender', None, None, ), # 1
)
class GenerateRadioStation_result(object):
"""
Attributes:
- success
- sErrorSystemE
"""
def __init__(self, success=None, sErrorSystemE=None,):
self.success = success
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype52, _size49) = iprot.readListBegin()
for _i53 in range(_size49):
_elem54 = SpotifakeManagement.ttypes.Track()
_elem54.read(iprot)
self.success.append(_elem54)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GenerateRadioStation_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter55 in self.success:
iter55.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GenerateRadioStation_result)
GenerateRadioStation_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.Track, None], False), None, ), # 0
(1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 1
)
class GetLocalTracksByIdConsumer_args(object):
"""
Attributes:
- idConsumer
"""
def __init__(self, idConsumer=None,):
self.idConsumer = idConsumer
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I16:
self.idConsumer = iprot.readI16()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetLocalTracksByIdConsumer_args')
if self.idConsumer is not None:
oprot.writeFieldBegin('idConsumer', TType.I16, 1)
oprot.writeI16(self.idConsumer)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetLocalTracksByIdConsumer_args)
GetLocalTracksByIdConsumer_args.thrift_spec = (
None, # 0
(1, TType.I16, 'idConsumer', None, None, ), # 1
)
class GetLocalTracksByIdConsumer_result(object):
"""
Attributes:
- success
- sErrorSystemE
"""
def __init__(self, success=None, sErrorSystemE=None,):
self.success = success
self.sErrorSystemE = sErrorSystemE
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype59, _size56) = iprot.readListBegin()
for _i60 in range(_size56):
_elem61 = SpotifakeManagement.ttypes.LocalTrack()
_elem61.read(iprot)
self.success.append(_elem61)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
self.sErrorSystemE.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('GetLocalTracksByIdConsumer_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter62 in self.success:
iter62.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.sErrorSystemE is not None:
oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
self.sErrorSystemE.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(GetLocalTracksByIdConsumer_result)
GetLocalTracksByIdConsumer_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [SpotifakeManagement.ttypes.LocalTrack, None], False), None, ), # 0
(1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ), # 1
)
class AddLocalTrack_args(object):
"""
Attributes:
- LocalTrack
"""
def __init__(self, LocalTrack=None,):
self.LocalTrack = LocalTrack
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.LocalTrack = SpotifakeManagement.ttypes.LocalTrack()
self.LocalTrack.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('AddLocalTrack_args')
if self.LocalTrack is not None:
oprot.writeFieldBegin('LocalTrack', TType.STRUCT, 1)
self.LocalTrack.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddLocalTrack_args)
AddLocalTrack_args.thrift_spec = (
    None,  # 0
    (1, TType.STRUCT, 'LocalTrack', [SpotifakeManagement.ttypes.LocalTrack, None], None, ),  # 1
)


class AddLocalTrack_result(object):
    """
    Attributes:
     - success
     - sErrorSystemE
    """

    def __init__(self, success=None, sErrorSystemE=None,):
        self.success = success
        self.sErrorSystemE = sErrorSystemE

    def read(self, iprot):
        if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
            iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
            return
        iprot.readStructBegin()
        while True:
            (fname, ftype, fid) = iprot.readFieldBegin()
            if ftype == TType.STOP:
                break
            if fid == 0:
                if ftype == TType.BOOL:
                    self.success = iprot.readBool()
                else:
                    iprot.skip(ftype)
            elif fid == 1:
                if ftype == TType.STRUCT:
                    self.sErrorSystemE = SpotifakeManagement.ttypes.SErrorSystemException()
                    self.sErrorSystemE.read(iprot)
                else:
                    iprot.skip(ftype)
            else:
                iprot.skip(ftype)
            iprot.readFieldEnd()
        iprot.readStructEnd()

    def write(self, oprot):
        if oprot._fast_encode is not None and self.thrift_spec is not None:
            oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
            return
        oprot.writeStructBegin('AddLocalTrack_result')
        if self.success is not None:
            oprot.writeFieldBegin('success', TType.BOOL, 0)
            oprot.writeBool(self.success)
            oprot.writeFieldEnd()
        if self.sErrorSystemE is not None:
            oprot.writeFieldBegin('sErrorSystemE', TType.STRUCT, 1)
            self.sErrorSystemE.write(oprot)
            oprot.writeFieldEnd()
        oprot.writeFieldStop()
        oprot.writeStructEnd()

    def validate(self):
        return

    def __repr__(self):
        L = ['%s=%r' % (key, value)
             for key, value in self.__dict__.items()]
        return '%s(%s)' % (self.__class__.__name__, ', '.join(L))

    def __eq__(self, other):
        return isinstance(other, self.__class__) and self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not (self == other)


all_structs.append(AddLocalTrack_result)
AddLocalTrack_result.thrift_spec = (
    (0, TType.BOOL, 'success', None, None, ),  # 0
    (1, TType.STRUCT, 'sErrorSystemE', [SpotifakeManagement.ttypes.SErrorSystemException, None], None, ),  # 1
)
fix_spec(all_structs)
del all_structs
| 35.23872 | 134 | 0.608623 | 16,395 | 171,824 | 6.190607 | 0.017688 | 0.014286 | 0.025716 | 0.021814 | 0.862978 | 0.835775 | 0.827647 | 0.817725 | 0.81402 | 0.81402 | 0 | 0.006773 | 0.301431 | 171,824 | 4,875 | 135 | 35.245949 | 0.8388 | 0.055906 | 0 | 0.844193 | 1 | 0 | 0.042121 | 0.007877 | 0 | 0 | 0 | 0 | 0 | 1 | 0.103399 | false | 0.005382 | 0.002266 | 0.032295 | 0.193201 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
de8ea94d6fad15f38a66895b29351c7311d8cfb8 | 53 | py | Python | randNum.py | william-letton/ASSIST | b1b998de257a38f02206bccc8a1465bbd908b4f3 | [
"Unlicense"
] | null | null | null | randNum.py | william-letton/ASSIST | b1b998de257a38f02206bccc8a1465bbd908b4f3 | [
"Unlicense"
] | null | null | null | randNum.py | william-letton/ASSIST | b1b998de257a38f02206bccc8a1465bbd908b4f3 | [
"Unlicense"
] | null | null | null | from random import seed
from random import randint
| 17.666667 | 27 | 0.811321 | 8 | 53 | 5.375 | 0.625 | 0.465116 | 0.744186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.188679 | 53 | 2 | 28 | 26.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
de91abe1a9f34229f498cd7e7d243c4ba9c16cea | 10,492 | py | Python | rubiks_color_resolver/permutations.py | lidacity/rubiks-color-resolver | 88b181150b6f3db8d117715f30323cd3ece093c0 | [
"MIT"
] | null | null | null | rubiks_color_resolver/permutations.py | lidacity/rubiks-color-resolver | 88b181150b6f3db8d117715f30323cd3ece093c0 | [
"MIT"
] | null | null | null | rubiks_color_resolver/permutations.py | lidacity/rubiks-color-resolver | 88b181150b6f3db8d117715f30323cd3ece093c0 | [
"MIT"
] | null | null | null | def permutations(iterable, r=None):
    """
    From https://github.com/python/cpython/blob/master/Modules/itertoolsmodule.c
    """
    pool = tuple(iterable)
    n = len(pool)
    r = n if r is None else r
    indices = list(range(n))
    cycles = list(range(n - r + 1, n + 1))[::-1]
    yield tuple(pool[i] for i in indices[:r])
    while n:
        for i in reversed(list(range(r))):
            cycles[i] -= 1
            if cycles[i] == 0:
                indices[i:] = indices[i + 1 :] + indices[i : i + 1]
                cycles[i] = n - i
            else:
                j = cycles[i]
                indices[i], indices[-j] = indices[-j], indices[i]
                yield tuple(pool[i] for i in indices[:r])
                break
        else:
            return
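The generator above is a pure-Python port of CPython's `itertools.permutations`, so its behavior can be checked against the stdlib version. A quick standalone sketch (stdlib only; the color labels are illustrative):

```python
from itertools import permutations as it_permutations

# r-length orderings of four color labels, as this module uses for cube faces
colors = ("W", "O", "G", "B")
perms = list(it_permutations(colors, 2))

# 4 * 3 = 12 ordered pairs, all distinct, first one in pool order
print(len(perms))   # 12
print(perms[0])     # ('W', 'O')
```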
odd_cube_center_color_permutations = (
("W", "O", "G", "R", "B", "Y"),
("W", "G", "R", "B", "O", "Y"),
("W", "B", "O", "G", "R", "Y"),
("W", "R", "B", "O", "G", "Y"),
("Y", "B", "R", "G", "O", "W"),
("Y", "G", "O", "B", "R", "W"),
("Y", "R", "G", "O", "B", "W"),
("Y", "O", "B", "R", "G", "W"),
("O", "Y", "G", "W", "B", "R"),
("O", "W", "B", "Y", "G", "R"),
("O", "G", "W", "B", "Y", "R"),
("O", "B", "Y", "G", "W", "R"),
("G", "Y", "R", "W", "O", "B"),
("G", "W", "O", "Y", "R", "B"),
("G", "R", "W", "O", "Y", "B"),
("G", "O", "Y", "R", "W", "B"),
("R", "Y", "B", "W", "G", "O"),
("R", "W", "G", "Y", "B", "O"),
("R", "B", "W", "G", "Y", "O"),
("R", "G", "Y", "B", "W", "O"),
("B", "W", "R", "Y", "O", "G"),
("B", "Y", "O", "W", "R", "G"),
("B", "R", "Y", "O", "W", "G"),
("B", "O", "W", "R", "Y", "G"),
)
# even_cube_center_color_permutations = list(sorted(permutations(ALL_COLORS)))
even_cube_center_color_permutations = """B G O R W Y
B G O R Y W
B G O W R Y
B G O W Y R
B G O Y R W
B G O Y W R
B G R O W Y
B G R O Y W
B G R W O Y
B G R W Y O
B G R Y O W
B G R Y W O
B G W O R Y
B G W O Y R
B G W R O Y
B G W R Y O
B G W Y O R
B G W Y R O
B G Y O R W
B G Y O W R
B G Y R O W
B G Y R W O
B G Y W O R
B G Y W R O
B O G R W Y
B O G R Y W
B O G W R Y
B O G W Y R
B O G Y R W
B O G Y W R
B O R G W Y
B O R G Y W
B O R W G Y
B O R W Y G
B O R Y G W
B O R Y W G
B O W G R Y
B O W G Y R
B O W R G Y
B O W R Y G
B O W Y G R
B O W Y R G
B O Y G R W
B O Y G W R
B O Y R G W
B O Y R W G
B O Y W G R
B O Y W R G
B R G O W Y
B R G O Y W
B R G W O Y
B R G W Y O
B R G Y O W
B R G Y W O
B R O G W Y
B R O G Y W
B R O W G Y
B R O W Y G
B R O Y G W
B R O Y W G
B R W G O Y
B R W G Y O
B R W O G Y
B R W O Y G
B R W Y G O
B R W Y O G
B R Y G O W
B R Y G W O
B R Y O G W
B R Y O W G
B R Y W G O
B R Y W O G
B W G O R Y
B W G O Y R
B W G R O Y
B W G R Y O
B W G Y O R
B W G Y R O
B W O G R Y
B W O G Y R
B W O R G Y
B W O R Y G
B W O Y G R
B W O Y R G
B W R G O Y
B W R G Y O
B W R O G Y
B W R O Y G
B W R Y G O
B W R Y O G
B W Y G O R
B W Y G R O
B W Y O G R
B W Y O R G
B W Y R G O
B W Y R O G
B Y G O R W
B Y G O W R
B Y G R O W
B Y G R W O
B Y G W O R
B Y G W R O
B Y O G R W
B Y O G W R
B Y O R G W
B Y O R W G
B Y O W G R
B Y O W R G
B Y R G O W
B Y R G W O
B Y R O G W
B Y R O W G
B Y R W G O
B Y R W O G
B Y W G O R
B Y W G R O
B Y W O G R
B Y W O R G
B Y W R G O
B Y W R O G
G B O R W Y
G B O R Y W
G B O W R Y
G B O W Y R
G B O Y R W
G B O Y W R
G B R O W Y
G B R O Y W
G B R W O Y
G B R W Y O
G B R Y O W
G B R Y W O
G B W O R Y
G B W O Y R
G B W R O Y
G B W R Y O
G B W Y O R
G B W Y R O
G B Y O R W
G B Y O W R
G B Y R O W
G B Y R W O
G B Y W O R
G B Y W R O
G O B R W Y
G O B R Y W
G O B W R Y
G O B W Y R
G O B Y R W
G O B Y W R
G O R B W Y
G O R B Y W
G O R W B Y
G O R W Y B
G O R Y B W
G O R Y W B
G O W B R Y
G O W B Y R
G O W R B Y
G O W R Y B
G O W Y B R
G O W Y R B
G O Y B R W
G O Y B W R
G O Y R B W
G O Y R W B
G O Y W B R
G O Y W R B
G R B O W Y
G R B O Y W
G R B W O Y
G R B W Y O
G R B Y O W
G R B Y W O
G R O B W Y
G R O B Y W
G R O W B Y
G R O W Y B
G R O Y B W
G R O Y W B
G R W B O Y
G R W B Y O
G R W O B Y
G R W O Y B
G R W Y B O
G R W Y O B
G R Y B O W
G R Y B W O
G R Y O B W
G R Y O W B
G R Y W B O
G R Y W O B
G W B O R Y
G W B O Y R
G W B R O Y
G W B R Y O
G W B Y O R
G W B Y R O
G W O B R Y
G W O B Y R
G W O R B Y
G W O R Y B
G W O Y B R
G W O Y R B
G W R B O Y
G W R B Y O
G W R O B Y
G W R O Y B
G W R Y B O
G W R Y O B
G W Y B O R
G W Y B R O
G W Y O B R
G W Y O R B
G W Y R B O
G W Y R O B
G Y B O R W
G Y B O W R
G Y B R O W
G Y B R W O
G Y B W O R
G Y B W R O
G Y O B R W
G Y O B W R
G Y O R B W
G Y O R W B
G Y O W B R
G Y O W R B
G Y R B O W
G Y R B W O
G Y R O B W
G Y R O W B
G Y R W B O
G Y R W O B
G Y W B O R
G Y W B R O
G Y W O B R
G Y W O R B
G Y W R B O
G Y W R O B
O B G R W Y
O B G R Y W
O B G W R Y
O B G W Y R
O B G Y R W
O B G Y W R
O B R G W Y
O B R G Y W
O B R W G Y
O B R W Y G
O B R Y G W
O B R Y W G
O B W G R Y
O B W G Y R
O B W R G Y
O B W R Y G
O B W Y G R
O B W Y R G
O B Y G R W
O B Y G W R
O B Y R G W
O B Y R W G
O B Y W G R
O B Y W R G
O G B R W Y
O G B R Y W
O G B W R Y
O G B W Y R
O G B Y R W
O G B Y W R
O G R B W Y
O G R B Y W
O G R W B Y
O G R W Y B
O G R Y B W
O G R Y W B
O G W B R Y
O G W B Y R
O G W R B Y
O G W R Y B
O G W Y B R
O G W Y R B
O G Y B R W
O G Y B W R
O G Y R B W
O G Y R W B
O G Y W B R
O G Y W R B
O R B G W Y
O R B G Y W
O R B W G Y
O R B W Y G
O R B Y G W
O R B Y W G
O R G B W Y
O R G B Y W
O R G W B Y
O R G W Y B
O R G Y B W
O R G Y W B
O R W B G Y
O R W B Y G
O R W G B Y
O R W G Y B
O R W Y B G
O R W Y G B
O R Y B G W
O R Y B W G
O R Y G B W
O R Y G W B
O R Y W B G
O R Y W G B
O W B G R Y
O W B G Y R
O W B R G Y
O W B R Y G
O W B Y G R
O W B Y R G
O W G B R Y
O W G B Y R
O W G R B Y
O W G R Y B
O W G Y B R
O W G Y R B
O W R B G Y
O W R B Y G
O W R G B Y
O W R G Y B
O W R Y B G
O W R Y G B
O W Y B G R
O W Y B R G
O W Y G B R
O W Y G R B
O W Y R B G
O W Y R G B
O Y B G R W
O Y B G W R
O Y B R G W
O Y B R W G
O Y B W G R
O Y B W R G
O Y G B R W
O Y G B W R
O Y G R B W
O Y G R W B
O Y G W B R
O Y G W R B
O Y R B G W
O Y R B W G
O Y R G B W
O Y R G W B
O Y R W B G
O Y R W G B
O Y W B G R
O Y W B R G
O Y W G B R
O Y W G R B
O Y W R B G
O Y W R G B
R B G O W Y
R B G O Y W
R B G W O Y
R B G W Y O
R B G Y O W
R B G Y W O
R B O G W Y
R B O G Y W
R B O W G Y
R B O W Y G
R B O Y G W
R B O Y W G
R B W G O Y
R B W G Y O
R B W O G Y
R B W O Y G
R B W Y G O
R B W Y O G
R B Y G O W
R B Y G W O
R B Y O G W
R B Y O W G
R B Y W G O
R B Y W O G
R G B O W Y
R G B O Y W
R G B W O Y
R G B W Y O
R G B Y O W
R G B Y W O
R G O B W Y
R G O B Y W
R G O W B Y
R G O W Y B
R G O Y B W
R G O Y W B
R G W B O Y
R G W B Y O
R G W O B Y
R G W O Y B
R G W Y B O
R G W Y O B
R G Y B O W
R G Y B W O
R G Y O B W
R G Y O W B
R G Y W B O
R G Y W O B
R O B G W Y
R O B G Y W
R O B W G Y
R O B W Y G
R O B Y G W
R O B Y W G
R O G B W Y
R O G B Y W
R O G W B Y
R O G W Y B
R O G Y B W
R O G Y W B
R O W B G Y
R O W B Y G
R O W G B Y
R O W G Y B
R O W Y B G
R O W Y G B
R O Y B G W
R O Y B W G
R O Y G B W
R O Y G W B
R O Y W B G
R O Y W G B
R W B G O Y
R W B G Y O
R W B O G Y
R W B O Y G
R W B Y G O
R W B Y O G
R W G B O Y
R W G B Y O
R W G O B Y
R W G O Y B
R W G Y B O
R W G Y O B
R W O B G Y
R W O B Y G
R W O G B Y
R W O G Y B
R W O Y B G
R W O Y G B
R W Y B G O
R W Y B O G
R W Y G B O
R W Y G O B
R W Y O B G
R W Y O G B
R Y B G O W
R Y B G W O
R Y B O G W
R Y B O W G
R Y B W G O
R Y B W O G
R Y G B O W
R Y G B W O
R Y G O B W
R Y G O W B
R Y G W B O
R Y G W O B
R Y O B G W
R Y O B W G
R Y O G B W
R Y O G W B
R Y O W B G
R Y O W G B
R Y W B G O
R Y W B O G
R Y W G B O
R Y W G O B
R Y W O B G
R Y W O G B
W B G O R Y
W B G O Y R
W B G R O Y
W B G R Y O
W B G Y O R
W B G Y R O
W B O G R Y
W B O G Y R
W B O R G Y
W B O R Y G
W B O Y G R
W B O Y R G
W B R G O Y
W B R G Y O
W B R O G Y
W B R O Y G
W B R Y G O
W B R Y O G
W B Y G O R
W B Y G R O
W B Y O G R
W B Y O R G
W B Y R G O
W B Y R O G
W G B O R Y
W G B O Y R
W G B R O Y
W G B R Y O
W G B Y O R
W G B Y R O
W G O B R Y
W G O B Y R
W G O R B Y
W G O R Y B
W G O Y B R
W G O Y R B
W G R B O Y
W G R B Y O
W G R O B Y
W G R O Y B
W G R Y B O
W G R Y O B
W G Y B O R
W G Y B R O
W G Y O B R
W G Y O R B
W G Y R B O
W G Y R O B
W O B G R Y
W O B G Y R
W O B R G Y
W O B R Y G
W O B Y G R
W O B Y R G
W O G B R Y
W O G B Y R
W O G R B Y
W O G R Y B
W O G Y B R
W O G Y R B
W O R B G Y
W O R B Y G
W O R G B Y
W O R G Y B
W O R Y B G
W O R Y G B
W O Y B G R
W O Y B R G
W O Y G B R
W O Y G R B
W O Y R B G
W O Y R G B
W R B G O Y
W R B G Y O
W R B O G Y
W R B O Y G
W R B Y G O
W R B Y O G
W R G B O Y
W R G B Y O
W R G O B Y
W R G O Y B
W R G Y B O
W R G Y O B
W R O B G Y
W R O B Y G
W R O G B Y
W R O G Y B
W R O Y B G
W R O Y G B
W R Y B G O
W R Y B O G
W R Y G B O
W R Y G O B
W R Y O B G
W R Y O G B
W Y B G O R
W Y B G R O
W Y B O G R
W Y B O R G
W Y B R G O
W Y B R O G
W Y G B O R
W Y G B R O
W Y G O B R
W Y G O R B
W Y G R B O
W Y G R O B
W Y O B G R
W Y O B R G
W Y O G B R
W Y O G R B
W Y O R B G
W Y O R G B
W Y R B G O
W Y R B O G
W Y R G B O
W Y R G O B
W Y R O B G
W Y R O G B
Y B G O R W
Y B G O W R
Y B G R O W
Y B G R W O
Y B G W O R
Y B G W R O
Y B O G R W
Y B O G W R
Y B O R G W
Y B O R W G
Y B O W G R
Y B O W R G
Y B R G O W
Y B R G W O
Y B R O G W
Y B R O W G
Y B R W G O
Y B R W O G
Y B W G O R
Y B W G R O
Y B W O G R
Y B W O R G
Y B W R G O
Y B W R O G
Y G B O R W
Y G B O W R
Y G B R O W
Y G B R W O
Y G B W O R
Y G B W R O
Y G O B R W
Y G O B W R
Y G O R B W
Y G O R W B
Y G O W B R
Y G O W R B
Y G R B O W
Y G R B W O
Y G R O B W
Y G R O W B
Y G R W B O
Y G R W O B
Y G W B O R
Y G W B R O
Y G W O B R
Y G W O R B
Y G W R B O
Y G W R O B
Y O B G R W
Y O B G W R
Y O B R G W
Y O B R W G
Y O B W G R
Y O B W R G
Y O G B R W
Y O G B W R
Y O G R B W
Y O G R W B
Y O G W B R
Y O G W R B
Y O R B G W
Y O R B W G
Y O R G B W
Y O R G W B
Y O R W B G
Y O R W G B
Y O W B G R
Y O W B R G
Y O W G B R
Y O W G R B
Y O W R B G
Y O W R G B
Y R B G O W
Y R B G W O
Y R B O G W
Y R B O W G
Y R B W G O
Y R B W O G
Y R G B O W
Y R G B W O
Y R G O B W
Y R G O W B
Y R G W B O
Y R G W O B
Y R O B G W
Y R O B W G
Y R O G B W
Y R O G W B
Y R O W B G
Y R O W G B
Y R W B G O
Y R W B O G
Y R W G B O
Y R W G O B
Y R W O B G
Y R W O G B
Y W B G O R
Y W B G R O
Y W B O G R
Y W B O R G
Y W B R G O
Y W B R O G
Y W G B O R
Y W G B R O
Y W G O B R
Y W G O R B
Y W G R B O
Y W G R O B
Y W O B G R
Y W O B R G
Y W O G B R
Y W O G R B
Y W O R B G
Y W O R G B
Y W R B G O
Y W R B O G
Y W R G B O
Y W R G O B
Y W R O B G
Y W R O G B"""
len_even_cube_center_color_permutations = 720
| 13.538065 | 80 | 0.477888 | 4,596 | 10,492 | 1.087032 | 0.011314 | 0.060048 | 0.023419 | 0.011209 | 0.923739 | 0.904724 | 0.904524 | 0.892714 | 0.86269 | 0.435148 | 0 | 0.001777 | 0.463687 | 10,492 | 774 | 81 | 13.555556 | 0.886085 | 0.014678 | 0 | 0.005215 | 0 | 0 | 0.850901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.001304 | false | 0 | 0 | 0 | 0.002608 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
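The `even_cube_center_color_permutations` table above enumerates every ordering of the six colors, so its length must equal 6! = 720, matching `len_even_cube_center_color_permutations`. A standalone sanity check using only the stdlib:

```python
from itertools import permutations
from math import factorial

# number of orderings of six distinct color labels
count = sum(1 for _ in permutations("BGORWY"))
print(count)  # 720
assert count == factorial(6) == 720
```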
deaf4b434843998068b8042ea475c2dbf5b8a851 | 101,712 | py | Python | moments/LD/Matrices.py | grahamgower/moments | 54d2c58d91a231303fb361258e24b41b23f50661 | [
"BSD-3-Clause"
] | null | null | null | moments/LD/Matrices.py | grahamgower/moments | 54d2c58d91a231303fb361258e24b41b23f50661 | [
"BSD-3-Clause"
] | null | null | null | moments/LD/Matrices.py | grahamgower/moments | 54d2c58d91a231303fb361258e24b41b23f50661 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
from . import Util
from scipy.sparse import csc_matrix
## matrices for new LDstats2 models (h and compressed ys)
### drift
def drift_h(num_pops, nus, frozen=None):
    if num_pops != len(nus):
        raise ValueError("number of pops must match length of nus.")
    # if any population is frozen, we set that population's size to something
    # extremely large, so that drift is effectively zero
    if frozen is not None:
        if num_pops != len(frozen):
            raise ValueError("length of 'frozen' must match number of pops.")
        for pid in range(num_pops):
            if frozen[pid] == True:
                nus[pid] = 1e30
    D = np.zeros( ( int(num_pops*(num_pops+1)/2), int(num_pops*(num_pops+1)/2) ) )
    c = 0
    for ii in range(num_pops):
        D[c,c] = -1./nus[ii]
        c += (num_pops-ii)
    return D
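For intuition, a minimal standalone sketch (numpy only) mirroring the loop in drift_h above for two demes: with nus = [1.0, 2.0] the heterozygosity basis is (H_11, H_12, H_22), and drift only acts on the within-deme rows, so the matrix is diagonal with -1/nu_1 at index 0, -1/nu_2 at index 2, and a zero row for the cross-deme H_12:

```python
import numpy as np

# mirror of the drift_h loop for num_pops = 2 (illustrative values)
nus = [1.0, 2.0]
num_pops = 2
size = num_pops * (num_pops + 1) // 2   # 3 moments: H_11, H_12, H_22
D = np.zeros((size, size))
c = 0
for ii in range(num_pops):
    D[c, c] = -1.0 / nus[ii]
    c += num_pops - ii                  # skip past the cross-deme entries

print(np.diag(D))  # diagonal: -1/nu_1, 0, -1/nu_2
```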
def drift_ld(num_pops, nus, frozen=None):
    if num_pops != len(nus):
        raise ValueError("number of pops must match length of nus.")
    # if any population is frozen, we set that population's size to something
    # extremely large, so that drift is effectively zero
    if frozen is not None:
        if num_pops != len(frozen):
            raise ValueError("length of 'frozen' must match number of pops.")
        for pid in range(num_pops):
            if frozen[pid] == True:
                nus[pid] = 1e30
    names = Util.ld_names(num_pops)
    row = []
    col = []
    data = []
    for ii, name in enumerate(names):
        mom = name.split('_')[0]
        pops = name.split('_')[1:]
        if mom == 'DD':
            pop1, pop2 = [int(p) for p in pops]
            if pop1 == pop2:
                new_rows = [ii, ii, ii]
                new_cols = [names.index('DD_{0}_{0}'.format(pop1)),
                            names.index('Dz_{0}_{0}_{0}'.format(pop1)),
                            names.index('pi2_{0}_{0}_{0}_{0}'.format(pop1))]
                new_data = [-3./nus[pop1-1], 1./nus[pop1-1], 1./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            else:
                row.append( ii )
                col.append( names.index('DD_{0}_{1}'.format(pop1, pop2)) )
                data.append( - 1./nus[pop1-1] - 1./nus[pop2-1] )
        elif mom == 'Dz':
            pop1, pop2, pop3 = [int(p) for p in pops]
            if pop1 == pop2 == pop3:
                new_rows = [ii, ii]
                new_cols = [names.index('DD_{0}_{0}'.format(pop1)),
                            names.index('Dz_{0}_{0}_{0}'.format(pop1))]
                new_data = [4./nus[pop1-1], -5./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop2:
                row.append( ii )
                col.append( names.index('Dz_{0}_{1}_{2}'.format(pop1, pop2, pop3)) )
                data.append( -3./nus[pop1-1] )
            elif pop1 == pop3:
                row.append( ii )
                col.append( names.index('Dz_{0}_{1}_{2}'.format(pop1, pop2, pop3)) )
                data.append( -3./nus[pop1-1] )
            elif pop2 == pop3:
                new_rows = [ii, ii]
                new_cols = [names.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2))),
                            names.index('Dz_{0}_{1}_{1}'.format(pop1,pop2))]
                new_data = [4./nus[pop2-1], -1./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            else:  # all different
                row.append(ii)
                col.append(names.index('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop3)))
                data.append(-1./nus[pop1-1])
        elif mom == 'pi2':
            pop1, pop2, pop3, pop4 = [int(p) for p in pops]
            if pop1 == pop2 == pop3 == pop4:
                new_rows = [ii, ii]
                new_cols = [names.index('Dz_{0}_{0}_{0}'.format(pop1)),
                            names.index('pi2_{0}_{0}_{0}_{0}'.format(pop1))]
                new_data = [1./nus[pop1-1], -2./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop2 == pop3:
                new_rows = [ii, ii]
                new_cols = [names.index('Dz_{0}_{0}_{1}'.format(pop1,pop4)),
                            names.index('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop4))]
                new_data = [1./2/nus[pop1-1], -1./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop2 == pop4:
                new_rows = [ii, ii]
                new_cols = [names.index('Dz_{0}_{1}_{0}'.format(pop1,pop3)),
                            names.index('pi2_{0}_{0}_{1}_{0}'.format(pop1,pop3))]
                new_data = [1./2/nus[pop1-1], -1./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop2 and pop3 == pop4:
                row.append( ii )
                col.append( names.index('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop3)) )
                data.append( - 1./nus[pop1-1] - 1./nus[pop3-1] )
            elif pop1 == pop2:
                row.append( ii )
                col.append( names.index('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,pop4)) )
                data.append( -1./nus[pop1-1] )
            elif pop1 == pop3 == pop4:
                new_rows = [ii, ii]
                new_cols = [names.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2))),
                            names.index('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2))]
                new_data = [1./2/nus[pop1-1], -1./nus[pop1-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop2 == pop3 == pop4:
                new_rows = [ii, ii]
                new_cols = [names.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2))),
                            names.index('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2))]
                new_data = [1./2/nus[pop2-1], -1./nus[pop2-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop3 and pop2 == pop4:
                new_rows = [ii, ii]
                new_cols = [names.index('Dz_{0}_{1}_{1}'.format(pop1,pop2)),
                            names.index('Dz_{1}_{0}_{0}'.format(pop1,pop2))]
                #new_data = [-1./4/nus[pop1-1], 1./4/nus[pop2-1]] ### first value changed from - to +
                new_data = [1./4/nus[pop1-1], 1./4/nus[pop2-1]]
                for r,c,d in zip(new_rows, new_cols, new_data):
                    row.append(r)
                    col.append(c)
                    data.append(d)
            elif pop1 == pop3:
                row.append( ii )
                col.append( names.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop4))) )
                data.append( 1./4/nus[pop1-1] )
            elif pop1 == pop4:
                row.append( ii )
                col.append( names.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop3))) )
                data.append( 1./4/nus[pop1-1] )
            elif pop2 == pop3:
                row.append( ii )
                col.append( names.index(Util.map_moment('Dz_{1}_{0}_{2}'.format(pop1,pop2,pop4))) )
                data.append( 1./4/nus[pop2-1] )
            elif pop2 == pop4:
                row.append( ii )
                col.append( names.index(Util.map_moment('Dz_{1}_{0}_{2}'.format(pop1,pop2,pop3))) )
                data.append( 1./4/nus[pop2-1] )
            elif pop3 == pop4:
                row.append( ii )
                col.append( names.index('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,pop3)) )
                data.append( -1./nus[pop3-1] )
            else:
                if len(set([pop1,pop2,pop3,pop4])) < 4:
                    print("oh no")
                    print(pop1,pop2,pop3,pop4)
    return csc_matrix((data,(row,col)),shape=(len(names), len(names)))
### mutation
def mutation_h(num_pops, theta, frozen=None, selfing=None):
    if frozen is None and selfing is None:
        return theta*np.ones(int(num_pops*(num_pops+1)/2))
    else:
        U = np.zeros( int(num_pops*(num_pops+1)/2) )
        if selfing is None:
            selfing = [0] * num_pops
        if frozen is None:
            frozen = [False] * num_pops
        c = 0
        for ii in range(num_pops):
            for jj in range(ii,num_pops):
                if frozen[ii] is not True:
                    U[c] += theta/2.*(1-selfing[ii]/2.)
                if frozen[jj] is not True:
                    U[c] += theta/2.*(1-selfing[jj]/2.)
                c += 1
        return U
def mutation_ld(num_pops, theta, frozen=None, selfing=None):
    names_ld, names_h = Util.moment_names(num_pops)
    row = []
    col = []
    data = []
    if frozen is None and selfing is None:
        for ii, mom in enumerate(names_ld):
            name = mom.split('_')[0]
            if name == 'pi2':
                hmomp = 'H_' + mom.split('_')[1] + '_' + mom.split('_')[2]
                hmomq = 'H_' + mom.split('_')[3] + '_' + mom.split('_')[4]
                if hmomp == hmomq:
                    row.append(ii)
                    col.append(names_h.index(hmomp))
                    data.append(theta/2.)
                else:
                    row.append(ii)
                    row.append(ii)
                    col.append(names_h.index(hmomp))
                    col.append(names_h.index(hmomq))
                    data.append(theta/4.)
                    data.append(theta/4.)
    else:
        thetas = [theta]*num_pops
        if selfing is not None:
            for i,s in enumerate(selfing):
                thetas[i] = thetas[i]*(1-s/2.)
        if frozen is not None:
            for pid in range(num_pops):
                if frozen[pid] == True:
                    thetas[pid] = 0.0
        for ii, mom in enumerate(names_ld):
            name = mom.split('_')[0]
            if name == 'pi2':
                i,j,k,l = [int(x) for x in mom.split('_')[1:]]
                hmomp = 'H_' + str(i) + '_' + str(j)
                hmomq = 'H_' + str(k) + '_' + str(l)
                if hmomp == hmomq:
                    # i=k and j=l
                    row.append(ii)
                    col.append(names_h.index(hmomp))
                    data.append((thetas[i-1]/2. + thetas[j-1]/2.)/2.)
                else:
                    row.append(ii)
                    col.append(names_h.index(hmomp))  # check if k and l are frozen
                    data.append( (thetas[k-1]/2.+thetas[l-1]/2.) / 4. )
                    row.append(ii)
                    col.append(names_h.index(hmomq))  # check if i and j are frozen
                    data.append( (thetas[i-1]/2.+thetas[j-1]/2.) / 4. )
    return csc_matrix((data,(row,col)),shape=(len(names_ld), len(names_h)))
### recombination
def recombination(num_pops, r, frozen=None, selfing=None):
    names = Util.ld_names(num_pops)
    row = list(range(int(num_pops*(num_pops+1)/2 + num_pops**2*(num_pops+1)/2)))
    col = list(range(int(num_pops*(num_pops+1)/2 + num_pops**2*(num_pops+1)/2)))
    if frozen is None and selfing is None:
        data = [-1.*r]*int(num_pops*(num_pops+1)/2) + [-r/2.]*int(num_pops**2*(num_pops+1)/2)
    else:
        rs = [r]*num_pops
        if selfing is not None:
            for i,s in enumerate(selfing):
                rs[i] = rs[i] * (1-s)
        if frozen is not None:
            for pid in range(num_pops):
                if frozen[pid] == True:
                    rs[pid] = 0
        data = []
        for name in names:
            if name.split('_')[0] == 'DD':
                pop1 = int(name.split('_')[1])
                pop2 = int(name.split('_')[2])
                data.append(-1./2*rs[pop1-1] - 1./2*rs[pop2-1])
            elif name.split('_')[0] == 'Dz':
                pop1 = int(name.split('_')[1])
                data.append(-1./2*rs[pop1-1])
            else:
                continue
    return csc_matrix((data,(row,col)),shape=(len(names), len(names)))
### migration
def migration_h(num_pops, mig_mat, frozen=None):
    """
    mig_mat has the form [[0, m12, m13, ..., m1n], ..., [mn1, mn2, ..., 0]]

    Note that m12 is the probability that a lineage in deme 1 had its parent
    in deme 2, to be consistent with moments (fs).
    """
    if frozen is not None:
        for pid in range(num_pops):
            if frozen[pid] == True:
                for pid2 in range(num_pops):
                    mig_mat[pid][pid2] = 0
                    mig_mat[pid2][pid] = 0
    Hs = Util.het_names(num_pops)
    M = np.zeros( ( len(Hs), len(Hs) ) )
    for ii,H in enumerate(Hs):
        pop1,pop2 = [int(f) for f in H.split('_')[1:]]
        if pop1 == pop2:
            for jj in range(1,num_pops+1):
                if jj == pop1:
                    continue
                else:
                    M[ii,ii] -= 2*mig_mat[pop1-1][jj-1]
                    M[ii,Hs.index(Util.map_moment('H_{0}_{1}'.format(pop1,jj)))] += 2*mig_mat[pop1-1][jj-1]
        else:
            for jj in range(1,num_pops+1):
                if jj == pop1:
                    continue
                else:
                    M[ii,ii] -= mig_mat[pop1-1][jj-1]
                    M[ii,Hs.index(Util.map_moment('H_{0}_{1}'.format(pop2,jj)))] += mig_mat[pop1-1][jj-1]
            for jj in range(1,num_pops+1):
                if jj == pop2:
                    continue
                else:
                    M[ii,ii] -= mig_mat[pop2-1][jj-1]
                    M[ii,Hs.index(Util.map_moment('H_{0}_{1}'.format(pop1,jj)))] += mig_mat[pop2-1][jj-1]
    return M
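The docstring's migration-matrix convention can be illustrated with a small standalone example (plain Python lists, hypothetical rates): mig_mat[i][j] holds the backwards rate m_{(i+1)(j+1)}, the probability a lineage in deme i+1 had its parent in deme j+1, with zeros on the diagonal.

```python
# hypothetical 3-deme migration matrix in the documented convention
m12, m13, m21, m23, m31, m32 = 0.5, 0.1, 0.5, 0.2, 0.1, 0.2
mig_mat = [[0, m12, m13],
           [m21, 0, m23],
           [m31, m32, 0]]

# no deme migrates "to itself": the diagonal is always zero
assert all(mig_mat[i][i] == 0 for i in range(3))
print(mig_mat[0][1])  # 0.5, the rate from deme 1 back to deme 2
```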
def migration_ld(num_pops, mig_mat, frozen=None):
    if frozen is not None:
        for pid in range(num_pops):
            if frozen[pid] == True:
                for pid2 in range(num_pops):
                    mig_mat[pid][pid2] = 0
                    mig_mat[pid2][pid] = 0
    Ys = Util.ld_names(num_pops)
    M = np.zeros( ( len(Ys), len(Ys) ) )
    for ii, mom in enumerate(Ys):
        name = mom.split('_')[0]
        pops = [ int(p) for p in mom.split('_')[1:] ]
        if name == 'DD':
            pop1, pop2 = pops
            if pop1 == pop2:
                for jj in range(1,num_pops+1):
                    if jj != pop1:
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{0}'.format(pop1)))] -= 2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{1}'.format(pop1,jj)))] += 2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1)))] += 1./2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,jj)))] -= 1./2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,jj)))] -= 1./2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,jj)))] += 1./2 * mig_mat[pop1-1][jj-1]
            else:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2)))] -= mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{1}'.format(kk,pop2)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop2,pop1)))] += 1./4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop2,pop1,kk)))] -= 1./4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop2,kk,pop1)))] -= 1./4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop2,kk)))] += 1./4 * mig_mat[pop1-1][kk-1]
                    if kk != pop2:
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2)))] -= mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('DD_{0}_{1}'.format(pop1,kk)))] += mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += 1./4 * mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,kk)))] -= 1./4 * mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,kk,pop2)))] -= 1./4 * mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,kk)))] += 1./4 * mig_mat[pop2-1][kk-1]
        elif name == 'Dz':
            pop1, pop2, pop3 = pops
            if pop1 == pop2 == pop3:
                for jj in range(1,num_pops+1):
                    if jj != pop1:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1)))] -= 3 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,jj)))] += mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,jj)))] += mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,jj)))] += mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1)))] += 4 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,jj)))] -= 4 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,jj)))] -= 4 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,jj)))] += 4 * mig_mat[pop1-1][jj-1]
            elif pop1 == pop2:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop3)))] -= 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(kk,pop1,pop3)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,kk,pop3)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop3)))] += 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop3,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop3,kk)))] += 4 * mig_mat[pop1-1][kk-1]
                    if kk != pop3:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop3)))] -= mig_mat[pop3-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,kk)))] += mig_mat[pop3-1][kk-1]
            elif pop1 == pop3:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] -= 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(kk,pop2,pop1)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,kk)))] += 4 * mig_mat[pop1-1][kk-1]
                    if kk != pop2:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] -= mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,kk)))] += mig_mat[pop2-1][kk-1]
            elif pop2 == pop3:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] -= mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{2}_{1}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,kk)))] -= 4 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,kk)))] += 4 * mig_mat[pop1-1][kk-1]
                    if kk != pop2:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] -= 2 * mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{2}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
            else:
                for ll in range(1,num_pops+1):
                    if ll != pop1:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop3)))] -= mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(ll,pop2,pop3)))] += mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,pop3)))] += 4 * mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,ll)))] -= 4 * mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{0}_{2}'.format(pop1,pop2,pop3,ll)))] -= 4 * mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{2}_{3}'.format(pop1,pop2,pop3,ll)))] += 4 * mig_mat[pop1-1][ll-1]
                    if ll != pop2:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop3)))] -= mig_mat[pop2-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,ll,pop3)))] += mig_mat[pop2-1][ll-1]
                    if ll != pop3:
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,pop3)))] -= mig_mat[pop3-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,ll)))] += mig_mat[pop3-1][ll-1]
        elif name == 'pi2':
            pop1, pop2, pop3, pop4 = pops
            if pop1 == pop2 == pop3 == pop4:
                for jj in range(1,num_pops+1):
                    if jj != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1)))] -= 4 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,jj)))] += 2 * mig_mat[pop1-1][jj-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,jj)))] += 2 * mig_mat[pop1-1][jj-1]
            elif pop1 == pop2 == pop3:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop4)))] -= 3 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop4,kk)))] += 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop4,kk)))] += mig_mat[pop1-1][kk-1]
                    if kk != pop4:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop4)))] -= mig_mat[pop4-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop4,kk)))] += mig_mat[pop4-1][kk-1]
            elif pop1 == pop2 == pop4:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{0}'.format(pop1,pop3)))] -= 3 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{1}_{0}'.format(pop1,pop3,kk)))] += 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,kk)))] += mig_mat[pop1-1][kk-1]
                    if kk != pop3:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{0}'.format(pop1,pop3)))] -= mig_mat[pop3-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{2}_{0}'.format(pop1,pop3,kk)))] += mig_mat[pop3-1][kk-1]
            elif pop1 == pop2 and pop3 == pop4:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop3)))] -= 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop3,kk)))] += 2 * mig_mat[pop1-1][kk-1]
                    if kk != pop3:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop3)))] -= 2 * mig_mat[pop3-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,kk)))] += 2 * mig_mat[pop3-1][kk-1]
            elif pop1 == pop2:
                for ll in range(1,num_pops+1):
                    if ll != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,pop4)))] -= 2 * mig_mat[pop1-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{1}_{2}'.format(pop1,pop3,pop4,ll)))] += 2 * mig_mat[pop1-1][ll-1]
                    if ll != pop3:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,pop4)))] -= mig_mat[pop3-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{2}_{3}'.format(pop1,pop3,pop4,ll)))] += mig_mat[pop3-1][ll-1]
                    if ll != pop4:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop3,pop4)))] -= mig_mat[pop4-1][ll-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{0}_{1}_{3}'.format(pop1,pop3,pop4,ll)))] += mig_mat[pop4-1][ll-1]
            elif pop1 == pop3 == pop4:
                for kk in range(1,num_pops+1):
                    if kk != pop1:
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] -= 3 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,kk)))] += 2 * mig_mat[pop1-1][kk-1]
                        M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
if kk != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] -= mig_mat[pop2-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
elif pop2 == pop3 == pop4:
for kk in range(1,num_pops+1):
if kk != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] -= mig_mat[pop1-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
if kk != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] -= 3 * mig_mat[pop2-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,kk)))] += 2 * mig_mat[pop2-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
elif pop1 == pop3 and pop2 == pop4:
for kk in range(1,num_pops+1):
if kk != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] -= 2 * mig_mat[pop1-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop1-1][kk-1]
if kk != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] -= 2 * mig_mat[pop2-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,kk)))] += mig_mat[pop2-1][kk-1]
elif pop1 == pop3:
for ll in range(1,num_pops+1):
if ll != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,pop4)))] -= 2 * mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{0}_{2}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop1-1][ll-1]
if ll != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,pop4)))] -= mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{0}_{2}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop2-1][ll-1]
if ll != pop4:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,pop4)))] -= mig_mat[pop4-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{0}_{3}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop4-1][ll-1]
elif pop1 == pop4:
for ll in range(1,num_pops+1):
if ll != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{0}'.format(pop1,pop2,pop3)))] -= 2 * mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{2}_{0}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop1-1][ll-1]
if ll != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{0}'.format(pop1,pop2,pop3)))] -= mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{2}_{0}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop2-1][ll-1]
if ll != pop3:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{0}'.format(pop1,pop2,pop3)))] -= mig_mat[pop3-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{3}_{0}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop3-1][ll-1]
elif pop2 == pop3:
for ll in range(1,num_pops+1):
if ll != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,pop4)))] -= mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{1}_{2}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop1-1][ll-1]
if ll != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,pop4)))] -= 2 * mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{1}_{2}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop2-1][ll-1]
if ll != pop4:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,pop4)))] -= mig_mat[pop4-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{1}_{3}'.format(pop1,pop2,pop4,ll)))] += mig_mat[pop4-1][ll-1]
elif pop2 == pop4:
for ll in range(1,num_pops+1):
if ll != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{1}'.format(pop1,pop2,pop3)))] -= mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{2}_{1}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop1-1][ll-1]
if ll != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{1}'.format(pop1,pop2,pop3)))] -= 2 * mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{2}_{1}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop2-1][ll-1]
if ll != pop3:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{1}'.format(pop1,pop2,pop3)))] -= mig_mat[pop3-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{3}_{1}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop3-1][ll-1]
elif pop3 == pop4:
for ll in range(1,num_pops+1):
if ll != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,pop3)))] -= mig_mat[pop1-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{1}_{3}_{2}_{2}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop1-1][ll-1]
if ll != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,pop3)))] -= mig_mat[pop2-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{3}_{2}_{2}'.format(pop1,pop2,pop3,ll)))] += mig_mat[pop2-1][ll-1]
if ll != pop3:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,pop3)))] -= 2 * mig_mat[pop3-1][ll-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,ll)))] += 2 * mig_mat[pop3-1][ll-1]
else:
if len(set([pop1, pop2, pop3, pop4])) != 4:
raise ValueError("Expected four distinct populations for this pi2 moment.")
for ss in range(1,num_pops+1):
if ss != pop1:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,pop4)))] -= mig_mat[pop1-1][ss-1]
M[ii, Ys.index(Util.map_moment('pi2_{4}_{1}_{2}_{3}'.format(pop1,pop2,pop3,pop4,ss)))] += mig_mat[pop1-1][ss-1]
if ss != pop2:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,pop4)))] -= mig_mat[pop2-1][ss-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{4}_{2}_{3}'.format(pop1,pop2,pop3,pop4,ss)))] += mig_mat[pop2-1][ss-1]
if ss != pop3:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,pop4)))] -= mig_mat[pop3-1][ss-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{4}_{3}'.format(pop1,pop2,pop3,pop4,ss)))] += mig_mat[pop3-1][ss-1]
if ss != pop4:
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,pop3,pop4)))] -= mig_mat[pop4-1][ss-1]
M[ii, Ys.index(Util.map_moment('pi2_{0}_{1}_{2}_{4}'.format(pop1,pop2,pop3,pop4,ss)))] += mig_mat[pop4-1][ss-1]
return csc_matrix(M)
# Build the admixture transformation for heterozygosity (H) moments: a new
# population is created from pop1 and pop2 with proportions f and 1-f.
def admix_h(num_pops, pop1, pop2, f):
moms_from = Util.moment_names(num_pops)[1]
moms_to = Util.moment_names(num_pops+1)[1]
A = np.zeros((len(moms_to), len(moms_from)))
for ii, mom_to in enumerate(moms_to):
if mom_to in moms_from: # doesn't involve new pop (unchanged)
A[ii, moms_from.index(mom_to)] = 1
else: # all moments are of the form H_k_new, k in [1,...,new] (new = num_pops+1)
i1 = int(mom_to.split('_')[1])
i2 = int(mom_to.split('_')[2])
if i2 != num_pops+1:
raise ValueError("Unexpected moment {0}: i2 should be num_pops + 1.".format(mom_to))
if i1 == i2 == num_pops+1: # H_new_new
A[ii, moms_from.index(Util.map_moment('H_{0}_{0}'.format(pop1)))] = f**2
A[ii, moms_from.index(Util.map_moment('H_{0}_{1}'.format(pop1, pop2)))] = 2*f*(1-f)
A[ii, moms_from.index(Util.map_moment('H_{0}_{0}'.format(pop2)))] = (1-f)**2
elif i1 == pop1: # H_pop1_new
A[ii, moms_from.index(Util.map_moment('H_{0}_{0}'.format(pop1)))] = f
A[ii, moms_from.index(Util.map_moment('H_{0}_{1}'.format(pop1, pop2)))] = (1-f)
elif i1 == pop2: # H_pop2_new
A[ii, moms_from.index(Util.map_moment('H_{0}_{1}'.format(pop1, pop2)))] = f
A[ii, moms_from.index(Util.map_moment('H_{0}_{0}'.format(pop2)))] = (1-f)
else: # H_non-source_new
A[ii, moms_from.index(Util.map_moment('H_{0}_{1}'.format(pop1, i1)))] = f
A[ii, moms_from.index(Util.map_moment('H_{0}_{1}'.format(pop2, i1)))] = (1-f)
return A
# Build the admixture transformation for LD moments (DD, Dz, pi2): a new
# population is created from pop1 and pop2 with proportions f and 1-f.
def admix_ld(num_pops, pop1, pop2, f):
moms_from = Util.moment_names(num_pops)[0]
moms_to = Util.moment_names(num_pops+1)[0]
A = np.zeros((len(moms_to), len(moms_from)))
for ii, mom_to in enumerate(moms_to):
if mom_to in moms_from: # doesn't involve new pop (unchanged)
A[ii, moms_from.index(mom_to)] = 1
else: # moments are either DD, Dz, or pi2. we handle each in turn
mom_name = mom_to.split('_')[0]
if mom_name == 'DD':
i1 = int(mom_to.split('_')[1])
i2 = int(mom_to.split('_')[2])
if i1 == i2 == num_pops+1: # DD_new_new
A[ii, moms_from.index(Util.map_moment('DD_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2)))] += 2*f*(1-f)
A[ii, moms_from.index(Util.map_moment('DD_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += 1./2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += -1./2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += -1./2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += 1./2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += 1./2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += -1./2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += -1./2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += 1./2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += -2 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += -2 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += -2 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += -2 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += f**2 * (1-f)**2
elif i1 == pop1: # DD_pop1_new
A[ii, moms_from.index(Util.map_moment('DD_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2)))] += 1-f
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += 1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += 1./4 * f * (1-f)
elif i1 == pop2: # DD_pop2_new
A[ii, moms_from.index(Util.map_moment('DD_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('DD_{1}_{1}'.format(pop1,pop2)))] += 1-f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += 1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += 1./4 * f * (1-f)
else: # DD_non-source_new
A[ii, moms_from.index(Util.map_moment('DD_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('DD_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += 1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{0}'.format(pop1,pop2,i1)))] += -1./4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += 1./4 * f * (1-f)
elif mom_name == 'Dz':
i1 = int(mom_to.split('_')[1])
i2 = int(mom_to.split('_')[2])
i3 = int(mom_to.split('_')[3])
if i1 == i2 == i3 == num_pops+1: # Dz_new_new_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += f**3
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**3
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += 4 * f**3 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 4 * f**2 * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += -4 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 4 * f**2 * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f) * (1-2*f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)**2 * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += -4 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)**2 * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)**3
elif i1 == pop1 and i2 == i3 == num_pops+1: # Dz_pop1_new_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop2 and i2 == i3 == num_pops+1: # Dz_pop2_new_new
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i2 == i3 == num_pops+1: # Dz_non-source_new_new
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{0}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
elif i1 == i3 == num_pops+1 and i2 == pop1: # Dz_new_pop1_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += 4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += -4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)**2
elif i1 == i3 == num_pops+1 and i2 == pop2: # Dz_new_pop2_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += -4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)**2
elif i1 == i3 == num_pops+1: # Dz_new_non-source_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{0}'.format(pop1,pop2,i2)))] += f**2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{1}'.format(pop1,pop2,i2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{0}'.format(pop1,pop2,i2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{1}'.format(pop1,pop2,i2)))] += (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += 4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += -4 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += -4 * f * (1-f) * (1-2*f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)**2
elif i1 == num_pops+1 and i2 == pop1 and i3 == pop1: # Dz_new_pop1_pop1
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == pop1 and i3 == pop2: # Dz_new_pop1_pop2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == pop2 and i3 == pop1: # Dz_new_pop2_pop1
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == pop2 and i3 == pop2: # Dz_new_pop2_pop2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == pop1: # Dz_new_pop1_non
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop2,i3)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == pop2: # Dz_new_pop2_non
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i3 == pop1: # Dz_new_non_pop1
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{0}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{0}'.format(pop1,pop2,i2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i3 == pop2: # Dz_new_non_pop2
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{1}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{1}'.format(pop1,pop2,i2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
elif i1 == num_pops+1 and i2 == i3: # Dz_new_non_non (same non-source pop)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{2}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{2}'.format(pop1,pop2,i2)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i2)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i2)))] += 4 * f * (1-f)
elif i1 == num_pops+1: # Dz_new_non1_non2 (different non-source pops)
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{3}'.format(pop1,pop2,i2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{3}'.format(pop1,pop2,i2,i3)))] += (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i2,i3)))] += 4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i2,i3)))] += -4 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i2,i3)))] += 4 * f * (1-f)
elif i1 == pop1 and i2 == pop1 and i3 == num_pops+1: # Dz_pop1_pop1_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{0}_{1}'.format(pop1,pop2)))] += (1-f)
elif i1 == pop1 and i2 == pop2 and i3 == num_pops+1: # Dz_pop1_pop2_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{1}_{1}'.format(pop1,pop2)))] += (1-f)
elif i1 == pop1 and i3 == num_pops+1: # Dz_pop1_non_new
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{0}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{0}_{2}_{1}'.format(pop1,pop2,i2)))] += (1-f)
elif i1 == pop2 and i2 == pop1 and i3 == num_pops+1: # Dz_pop2_pop1_new
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{0}_{1}'.format(pop1,pop2)))] += (1-f)
elif i1 == pop2 and i2 == pop2 and i3 == num_pops+1: # Dz_pop2_pop2_new
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)
elif i1 == pop2 and i3 == num_pops+1: # Dz_pop2_non_new
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{0}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{1}_{2}_{1}'.format(pop1,pop2,i2)))] += (1-f)
elif i2 == pop1 and i3 == num_pops+1: # Dz_non_pop1_new
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += (1-f)
elif i2 == pop2 and i3 == num_pops+1: # Dz_non_pop2_new
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)
elif i1 == i2 and i3 == num_pops+1: # Dz_non_non_new
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{2}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{2}_{1}'.format(pop1,pop2,i1)))] += (1-f)
elif i3 == num_pops+1: # Dz_non1_non2_new
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{3}_{0}'.format(pop1,pop2,i1,i2)))] += f
A[ii, moms_from.index(Util.map_moment('Dz_{2}_{3}_{1}'.format(pop1,pop2,i1,i2)))] += (1-f)
else:
raise ValueError("Unhandled Dz moment: {0}".format(mom_to))
elif mom_name == 'pi2':
i1 = int(mom_to.split('_')[1])
i2 = int(mom_to.split('_')[2])
i3 = int(mom_to.split('_')[3])
i4 = int(mom_to.split('_')[4])
if i1 == i2 == i3 == i4 == num_pops+1: # pi2_new_new_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f**4
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 2 * f**3 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 2 * f**3 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 4 * f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)**3
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)**3
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**4
elif i2 == i3 == i4 == num_pops+1:
if i1 == pop1: # pi2_pop1_new_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f**3
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**3
elif i1 == pop2: # pi2_pop2_new_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**3
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**3
else: # pi2_non-source_new_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**3
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 2 * f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2 * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 2 * f * (1-f)**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**3
elif i3 == i4 == num_pops+1:
if i1 == i2 == pop1: # pi2_pop1_pop1_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop1 and i2 == pop2: # pi2_pop1_pop2_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop1: # pi2_pop1_non-source_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += (1-f)**2
elif i1 == pop2 and i2 == pop1: # pi2_pop2_pop1_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop2 and i2 == pop2: # pi2_pop2_pop2_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop2: # pi2_pop2_non-source_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += (1-f)**2
elif i2 == pop1: # pi2_non-source_pop1_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
elif i2 == pop2: # pi2_non-source_pop2_new_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
elif i1 == i2: # pi2_non_non_new_new (non-source pops are the same)
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
                else: # pi2_non1_non2_new_new (different non-source pops)
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{0}'.format(pop1,pop2,i1,i2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{1}'.format(pop1,pop2,i1,i2)))] += 2 * f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{1}_{1}'.format(pop1,pop2,i1,i2)))] += (1-f)**2
elif i2 == num_pops+1 and i4 == num_pops+1:
if i1 == pop1 and i3 == pop1: # pi2_pop1_new_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop1 and i3 == pop2: # pi2_pop1_new_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop1: # pi2_pop1_new_non-source_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop2,i3)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop2,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += (1-f)**2
elif i1 == pop2 and i3 == pop1: # pi2_pop2_new_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop2 and i3 == pop2: # pi2_pop2_new_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += (1-f)**2
elif i1 == pop2: # pi2_pop2_new_non-source_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += (1-f)**2
elif i3 == pop1: # pi2_non_new_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
elif i3 == pop2: # pi2_non_new_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += (1-f)**2
elif i1 == i3: # pi2_non_new_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += (1-f)**2
else: # pi2_non1_new_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f**2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f * (1-f)
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += (1-f)**2
elif i4 == num_pops+1:
if i1 == pop1 and i2 == pop1 and i3 == pop1: # pi2_pop1_pop1_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i2 == pop1 and i3 == pop2: # pi2_pop1_pop1_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i2 == pop1: # pi2_pop1_pop1_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop1 and i2 == pop2 and i3 == pop1: # pi2_pop1_pop2_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i2 == pop2 and i3 == pop2: # pi2_pop1_pop2_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i2 == pop2: # pi2_pop1_pop2_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop1 and i3 == pop1: # pi2_pop1_non_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop1 and i3 == pop2: # pi2_pop1_non_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop1 and i2 == i3: # pi2_pop1_non_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop1: # pi2_pop1_non1_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i2,i3)))] += 1-f
elif i1 == pop2 and i2 == pop1 and i3 == pop1: # pi2_pop2_pop1_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i2 == pop1 and i3 == pop2: # pi2_pop2_pop1_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i2 == pop1: # pi2_pop2_pop1_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop2 and i2 == pop2 and i3 == pop1: # pi2_pop2_pop2_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i2 == pop2 and i3 == pop2: # pi2_pop2_pop2_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i2 == pop2: # pi2_pop2_pop2_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop2 and i3 == pop1: # pi2_pop2_non_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop2 and i3 == pop2: # pi2_pop2_non_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop2 and i2 == i3: # pi2_pop2_non_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i2)))] += 1-f
elif i1 == pop2: # pi2_pop2_non1_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i2,i3)))] += 1-f
elif i2 == pop1 and i3 == pop1: # pi2_non_pop1_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop1 and i3 == pop2: # pi2_non_pop1_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop1 and i1 == i3: # pi2_non_pop1_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop1: # pi2_non1_pop1_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i2 == pop2 and i3 == pop1: # pi2_non_pop2_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop2 and i3 == pop2: # pi2_non_pop2_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop2 and i1 == i3: # pi2_non_pop2_non_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i2 == pop2: # pi2_non1_pop2_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i3 == pop1 and i1 == i2: # pi2_non1_non1_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop2 and i1 == i2: # pi2_non1_non1_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i1 == i2 == i3: # pi2_non1_non1_non1_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i1 == i2: # pi2_non1_non1_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i3 == pop1: # pi2_non1_non2_pop1_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{0}'.format(pop1,pop2,i1,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{1}'.format(pop1,pop2,i1,i2)))] += 1-f
elif i3 == pop2: # pi2_non1_non2_pop2_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{1}'.format(pop1,pop2,i1,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{1}_{1}'.format(pop1,pop2,i1,i2)))] += 1-f
elif i1 == i3: # pi2_non1_non2_non1_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{2}'.format(pop1,pop2,i1,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{1}_{2}'.format(pop1,pop2,i1,i2)))] += 1-f
elif i2 == i3: # pi2_non1_non2_non2_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{3}'.format(pop1,pop2,i1,i2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{1}_{3}'.format(pop1,pop2,i1,i2)))] += 1-f
else: # pi2_non1_non2_non3_new
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{0}_{4}'.format(pop1,pop2,i1,i2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{2}_{3}_{1}_{4}'.format(pop1,pop2,i1,i2,i3)))] += 1-f
elif i2 == num_pops+1:
if i1 == pop1 and i3 == pop1 and i4 == pop1: # pi2_pop1_new_pop1_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i3 == pop1 and i4 == pop2: # pi2_pop1_new_pop1_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i3 == pop1: # pi2_pop1_new_pop1_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop2,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i4)))] += 1-f
elif i1 == pop1 and i3 == pop2 and i4 == pop1: # pi2_pop1_new_pop2_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i3 == pop2 and i4 == pop2: # pi2_pop1_new_pop2_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop1 and i3 == pop2: # pi2_pop1_new_pop2_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop2,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i4)))] += 1-f
elif i1 == pop1 and i4 == pop1: # pi2_pop1_new_non_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop1 and i4 == pop2: # pi2_pop1_new_non_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{1}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop1 and i3 == i4: # pi2_pop1_new_non_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{2}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop1: # pi2_pop1_new_non1_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{0}_{2}_{3}'.format(pop1,pop2,i3,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,i3,i4)))] += 1-f
elif i1 == pop2 and i3 == pop1 and i4 == pop1: # pi2_pop2_new_pop1_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{0}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{0}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i3 == pop1 and i4 == pop2: # pi2_pop2_new_pop1_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i3 == pop1: # pi2_pop2_new_pop1_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{2}'.format(pop1,pop2,i4)))] += 1-f
elif i1 == pop2 and i3 == pop2 and i4 == pop1: # pi2_pop2_new_pop2_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i3 == pop2 and i4 == pop2: # pi2_pop2_new_pop2_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{1}'.format(pop1,pop2)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{1}'.format(pop1,pop2)))] += 1-f
elif i1 == pop2 and i3 == pop2: # pi2_pop2_new_pop2_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{2}'.format(pop1,pop2,i4)))] += 1-f
elif i1 == pop2 and i4 == pop1: # pi2_pop2_new_non_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{0}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop2 and i4 == pop2: # pi2_pop2_new_non_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{1}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop2 and i3 == i4: # pi2_pop2_new_non_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{2}_{2}'.format(pop1,pop2,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{2}_{2}'.format(pop1,pop2,i3)))] += 1-f
elif i1 == pop2: # pi2_pop2_new_non1_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{1}_{2}_{3}'.format(pop1,pop2,i3,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{1}_{2}_{3}'.format(pop1,pop2,i3,i4)))] += 1-f
elif i3 == pop1 and i4 == pop1: # pi2_non_new_pop1_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{0}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop1 and i4 == pop2: # pi2_non_new_pop1_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop1 and i1 == i4: # pi2_non_new_pop1_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop1: # pi2_non1_new_pop1_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i1,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i1,i4)))] += 1-f
elif i3 == pop2 and i4 == pop1: # pi2_non_new_pop2_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop2 and i4 == pop2: # pi2_non_new_pop2_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{1}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop2 and i1 == i4: # pi2_non_new_pop2_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i3 == pop2: # pi2_non1_new_pop2_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i1,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i1,i4)))] += 1-f
elif i1 == i3 and i4 == pop1: # pi2_non_new_non_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{2}'.format(pop1,pop2,i1)))] += 1-f
                elif i1 == i3 and i4 == pop2: # pi2_non_new_non_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i1 == i3 == i4: # pi2_non_new_non_non
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{2}_{2}'.format(pop1,pop2,i1)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{2}_{2}'.format(pop1,pop2,i1)))] += 1-f
elif i1 == i3: # pi2_non1_new_non1_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{2}_{3}'.format(pop1,pop2,i1,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{2}_{3}'.format(pop1,pop2,i1,i4)))] += 1-f
elif i4 == pop1: # pi2_non1_new_non2_pop1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{0}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i4 == pop2: # pi2_non1_new_non2_pop2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{1}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i4 == i1: # pi2_non1_new_non2_non1
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{2}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{2}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
elif i4 == i3: # pi2_non1_new_non2_non2
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{3}_{3}'.format(pop1,pop2,i1,i3)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{3}_{3}'.format(pop1,pop2,i1,i3)))] += 1-f
else: # pi2_non1_new_non2_non3
A[ii, moms_from.index(Util.map_moment('pi2_{0}_{2}_{3}_{4}'.format(pop1,pop2,i1,i3,i4)))] += f
A[ii, moms_from.index(Util.map_moment('pi2_{1}_{2}_{3}_{4}'.format(pop1,pop2,i1,i3,i4)))] += 1-f
else:
print("missed a pi2 : ", mom_to)
return A
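The branch-by-branch coefficients above are binomial in the admixture fraction `f`; a self-contained sanity check (the value of `f` is an assumed illustration, not from the source) confirms that each family of weights sums to one, as conservation of the moments requires:

```python
# Assumed admixture fraction for illustration; any 0 <= f <= 1 works.
f = 0.3

# weights attached to a moment carrying two lineages from the new population
two_lineage = [f**2, 2 * f * (1 - f), (1 - f)**2]

# weights attached to a moment carrying three lineages from the new
# population, e.g. the pi2_pop1_new_new_new branch: (f + (1 - f))**3 expanded
three_lineage = [
    f**3, 2 * f**2 * (1 - f), f * (1 - f)**2,
    f**2 * (1 - f), 2 * f * (1 - f)**2, (1 - f)**3,
]

assert abs(sum(two_lineage) - 1.0) < 1e-12
assert abs(sum(three_lineage) - 1.0) < 1e-12
```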
| 72.964132 | 135 | 0.484918 | 15,628 | 101,712 | 2.888981 | 0.012286 | 0.132007 | 0.155485 | 0.233228 | 0.955813 | 0.945536 | 0.930408 | 0.90868 | 0.892888 | 0.873375 | 0 | 0.095331 | 0.312313 | 101,712 | 1,393 | 136 | 73.016511 | 0.550153 | 0.037183 | 0 | 0.399138 | 0 | 0 | 0.11125 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.007759 | false | 0 | 0.002586 | 0 | 0.018966 | 0.00431 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
720c8efe1ef03c65e614cc1f21befc4990c2cd6f | 8813 | py | Python | phasor/electronics/noise.py | mccullerlp/OpenLoop | fe86dc6dec3740d4b6be6b88d8eef8566e2aa78d | ["Apache-2.0"] | 5 | 2018-02-28T00:43:37.000Z | 2020-01-21T11:39:15.000Z | phasor/electronics/noise.py | mccullerlp/OpenLoop | fe86dc6dec3740d4b6be6b88d8eef8566e2aa78d | ["Apache-2.0"] | 1 | 2019-09-07T23:15:43.000Z | 2019-09-07T23:15:43.000Z | phasor/electronics/noise.py | mccullerlp/OpenLoop | fe86dc6dec3740d4b6be6b88d8eef8566e2aa78d | ["Apache-2.0"] | 1 | 2020-08-21T04:42:09.000Z | 2020-08-21T04:42:09.000Z |
# -*- coding: utf-8 -*-
"""
"""
from __future__ import division, print_function, unicode_literals
import declarative as decl
from . import ports
from . import elements
sided_conversions = {
"one-sided" : 2,
"one sided" : 2,
"one" : 2,
"single-sided" : 2,
"single sided" : 2,
"single" : 2,
"two-sided" : 1,
"two sided" : 1,
"two" : 1,
"double-sided" : 1,
"double sided" : 1,
"double" : 1,
}
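The mapping above normalizes several spellings to a divisor applied to the user-supplied spectral density; a minimal sketch (with an assumed flat spectrum) of how that divisor converts a one-sided PSD into the internal two-sided convention:

```python
# Assumed example values; the dict mirrors the sided_conversions mapping above.
sided_conversions = {"one-sided": 2, "two-sided": 1}

one_sided_psd = 4e-18   # V^2/Hz in the one-sided convention (assumed value)
internal = one_sided_psd / sided_conversions["one-sided"]
assert internal == 2e-18                                    # halved
assert one_sided_psd / sided_conversions["two-sided"] == 4e-18  # unchanged
```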
class VoltageFluctuation(elements.ElectricalNoiseBase, elements.ElectricalElementBase):
@decl.dproperty
def port(self, val):
self.system.own_port_virtual(self, val.i)
self.system.own_port_virtual(self, val.o)
return val
@decl.dproperty
def p_virt(self):
return ports.ElectricalPort(sname = 'virtual')
@decl.dproperty
def sided(self, val):
assert(val in sided_conversions)
return val
@decl.mproperty
def conversion(self):
return sided_conversions[self.sided]
    @staticmethod
    def Vsq_Hz_by_freq(F):
        # default spectrum is noiseless; subclasses override with V^2/Hz
        return 0
def system_setup_ports(self, ports_algorithm):
for kto in ports_algorithm.port_update_get(self.port.i):
ports_algorithm.port_coupling_needed(self.p_virt.o, kto)
for kto in ports_algorithm.port_update_get(self.port.o):
ports_algorithm.port_coupling_needed(self.p_virt.o, kto)
for kfrom in ports_algorithm.port_update_get(self.p_virt.o):
ports_algorithm.port_coupling_needed(self.port.o, kfrom)
ports_algorithm.port_coupling_needed(self.port.i, kfrom)
return
def system_setup_coupling(self, matrix_algorithm):
#TODO: double check that porto needs to be used
porto_use = self.port.o # self.system.ports_post_get(self.port.o)
for kfrom in matrix_algorithm.port_set_get(self.p_virt.o):
matrix_algorithm.port_coupling_insert(
self.p_virt.o,
kfrom,
self.port.i,
kfrom,
1/2,
)
matrix_algorithm.port_coupling_insert(
self.p_virt.o,
kfrom,
porto_use,
kfrom,
-1/2,
)
def system_setup_noise(self, matrix_algorithm):
for k1 in matrix_algorithm.port_set_get(self.p_virt.o):
freq = k1[ports.ClassicalFreqKey]
k2 = k1.without_keys(ports.ClassicalFreqKey) | ports.DictKey({ports.ClassicalFreqKey : -freq})
matrix_algorithm.noise_pair_insert(
self.p_virt.o, k1, self.p_virt.o, k2, self
)
return
def noise_2pt_expectation(self, pe_1, k1, pe_2, k2):
freq = k1[ports.ClassicalFreqKey].frequency()
Vsq_Hz = self.Vsq_Hz_by_freq(freq) / self.conversion
return Vsq_Hz
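`Vsq_Hz_by_freq` defaults to zero and is meant to be overridden with an actual spectrum; a hedged sketch of such an override (the 1 nV/sqrt(Hz) floor and 10 Hz corner are assumed illustration values, not from the source):

```python
def Vsq_Hz_by_freq(F):
    # assumed: 1 nV/sqrt(Hz) flat floor with a 10 Hz 1/f corner
    flat = (1e-9)**2    # V^2/Hz
    corner = 10.0       # Hz
    return flat * (1.0 + corner / F)

# at the corner frequency the density doubles relative to the floor
assert abs(Vsq_Hz_by_freq(10.0) - 2e-18) < 1e-30
```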
class CurrentFluctuation(elements.ElectricalNoiseBase, elements.ElectricalElementBase):
@decl.dproperty
def port(self, val = None):
if val is not None:
self.system.own_port_virtual(self, val.o)
self.system.own_port_virtual(self, val.i)
return val
@decl.dproperty
def portA(self, val = None):
if val is None:
val = self.port.i
self.system.own_port_virtual(self, val)
return val
@decl.dproperty
def portB(self, val = None):
if val is None:
val = self.port.o
self.system.own_port_virtual(self, val)
return val
@decl.dproperty
def p_virt(self):
return ports.ElectricalPort(sname = 'virtual')
@decl.dproperty
def sided(self, val):
assert(val in sided_conversions)
return val
@decl.mproperty
def conversion(self):
return sided_conversions[self.sided]
    @staticmethod
    def Isq_Hz_by_freq(F):
        # default spectrum is noiseless; subclasses override with A^2/Hz
        return 0
def system_setup_ports(self, ports_algorithm):
for kto in ports_algorithm.port_update_get(self.portA):
ports_algorithm.port_coupling_needed(self.p_virt.o, kto)
for kto in ports_algorithm.port_update_get(self.portB):
ports_algorithm.port_coupling_needed(self.p_virt.o, kto)
for kfrom in ports_algorithm.port_update_get(self.p_virt.o):
ports_algorithm.port_coupling_needed(self.portB, kfrom)
ports_algorithm.port_coupling_needed(self.portA, kfrom)
return
def system_setup_coupling(self, matrix_algorithm):
#TODO: double check that porto needs to be used
#porto_use = self.system.ports_post_get(self.portB)
porto_use = self.portB # self.system.ports_post_get(self.portB)
for kfrom in matrix_algorithm.port_set_get(self.p_virt.o):
matrix_algorithm.port_coupling_insert(
self.p_virt.o,
kfrom,
self.portA,
kfrom,
1/2,
)
matrix_algorithm.port_coupling_insert(
self.p_virt.o,
kfrom,
porto_use,
kfrom,
1/2,
)
def system_setup_noise(self, matrix_algorithm):
for k1 in matrix_algorithm.port_set_get(self.p_virt.o):
freq = k1[ports.ClassicalFreqKey]
k2 = k1.without_keys(ports.ClassicalFreqKey) | ports.DictKey({ports.ClassicalFreqKey : -freq})
matrix_algorithm.noise_pair_insert(
self.p_virt.o, k1, self.p_virt.o, k2, self
)
return
def noise_2pt_expectation(self, pe_1, k1, pe_2, k2):
freq = k1[ports.ClassicalFreqKey].frequency()
Isq_Hz = self.Isq_Hz_by_freq(freq) / self.conversion
return (self.Ze_termination)**2 * Isq_Hz
class VoltageFluctuation2(elements.ElectricalNoiseBase):
@decl.dproperty
def port(self, val):
return val
@decl.dproperty
def sided(self, val):
assert(val in sided_conversions)
return val
@decl.mproperty
def conversion(self):
return sided_conversions[self.sided]
@staticmethod
def Vsq_Hz_by_freq(F):
return 0
def system_setup_noise(self, matrix_algorithm):
porti_use = self.port.i
porto_use = self.system.ports_post_get(self.port.o)
for k1 in matrix_algorithm.port_set_get(self.port.o):
freq = k1[ports.ClassicalFreqKey]
k2 = k1.without_keys(ports.ClassicalFreqKey) | ports.DictKey({ports.ClassicalFreqKey : -freq})
matrix_algorithm.noise_pair_insert(
porto_use, k1, porto_use, k2, self
)
matrix_algorithm.noise_pair_insert(
porti_use, k1, porto_use, k2, self
)
matrix_algorithm.noise_pair_insert(
porto_use, k1, porti_use, k2, self
)
matrix_algorithm.noise_pair_insert(
porti_use, k1, porti_use, k2, self
)
def noise_2pt_expectation(self, pe_1, k1, pe_2, k2):
freq = k1[ports.ClassicalFreqKey].frequency()
Vsq_Hz = self.Vsq_Hz_by_freq(freq) / self.conversion
if pe_1 == pe_2:
return Vsq_Hz / 4
else:
return -Vsq_Hz / 4
class CurrentFluctuation2(elements.ElectricalNoiseBase):
@decl.dproperty
def port(self, val):
return val
@decl.dproperty
def sided(self, val):
assert(val in sided_conversions)
return val
@decl.mproperty
def conversion(self):
return sided_conversions[self.sided]
@staticmethod
def Isq_Hz_by_freq(F):
return 0
def system_setup_noise(self, matrix_algorithm):
#print ("SETUP NOISE: ", self)
#porti_use = self.port.i
porto_use = self.system.ports_post_get(self.port.o)
for k1 in matrix_algorithm.port_set_get(self.port.o):
freq = k1[ports.ClassicalFreqKey]
k2 = k1.without_keys(ports.ClassicalFreqKey) | ports.DictKey({ports.ClassicalFreqKey : -freq})
matrix_algorithm.noise_pair_insert(
porto_use, k1, porto_use, k2, self
)
matrix_algorithm.noise_pair_insert(
self.port.i, k1, porto_use, k2, self
)
matrix_algorithm.noise_pair_insert(
porto_use, k1, self.port.i, k2, self
)
matrix_algorithm.noise_pair_insert(
self.port.i, k1, self.port.i, k2, self
)
pass
def noise_2pt_expectation(self, pe_1, k1, pe_2, k2):
freq = k1[ports.ClassicalFreqKey].frequency()
Isq_Hz = self.Isq_Hz_by_freq(freq) / self.conversion
if pe_1 == pe_2:
return (self.Ze_termination)**2 * Isq_Hz / 4
else:
return (self.Ze_termination)**2 * Isq_Hz / 4
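The noise setup methods above always pair a frequency key `k1` with its mirror `k2`, built by negating the classical frequency while keeping every other key. A dependency-free sketch of that key mirroring, using a plain dict as a hypothetical stand-in for the codebase's immutable `DictKey` (the `FREQ` name below is illustrative, not from the library):

```python
# FREQ plays the role of ports.ClassicalFreqKey; plain dicts stand in for
# the codebase's immutable DictKey type.
FREQ = "classical_freq"

def mirror_key(k1):
    # Same construction as k1.without_keys(FreqKey) | DictKey({FreqKey: -freq})
    rest = {k: v for k, v in k1.items() if k != FREQ}
    return {**rest, FREQ: -k1[FREQ]}

k1 = {FREQ: 2, "quadrature": "I"}
k2 = mirror_key(k1)
print(k2)  # {'quadrature': 'I', 'classical_freq': -2}
```

Pairing each key with its negated-frequency partner is what makes the inserted noise spectra Hermitian in the classical frequency index.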
# temboo/core/Library/RunKeeper/Settings/__init__.py (jordanemedlock/psychtruths @ 52e0903, Apache-2.0)
from temboo.Library.RunKeeper.Settings.RetrieveSettings import RetrieveSettings, RetrieveSettingsInputSet, RetrieveSettingsResultSet, RetrieveSettingsChoreographyExecution
from temboo.Library.RunKeeper.Settings.UpdateSettings import UpdateSettings, UpdateSettingsInputSet, UpdateSettingsResultSet, UpdateSettingsChoreographyExecution
# lumos/models.py (darth-dodo/project_lumos @ f0da375, MIT)
from django.db import models
from django.template.defaultfilters import slugify

# Create your models here.


def unicode_class(obj):
    s = ''
    for k, v in obj.__dict__.items():
        s += ('%s: %s\n' % (k, v))
    return s


def convert_to_dict(obj):
    return obj.__dict__
class ProgLang(models.Model):
    name = models.CharField(max_length=100, null=True, unique=True)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)
    sort_id = models.IntegerField(default=99)
    slug = models.SlugField()

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(ProgLang, self).save(*args, **kwargs)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'prog_lang'
class KnowledgeBase(models.Model):
    prog_lang = models.ForeignKey(ProgLang)
    title = models.CharField(max_length=200)
    link = models.CharField(max_length=200)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    diff_levels = (
        (0, 'Beginner'),
        (1, 'Intermediate'),
        (2, 'Advanced')
    )
    difficulty = models.IntegerField(choices=diff_levels)
    diff_sort = models.IntegerField(default=99)
    data_type = (
        (0, 'Video'),
        (1, 'Article'),
        (2, 'Interactive Site'),
        (3, 'Other')
    )
    media_type = models.IntegerField(choices=data_type, default=3)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'tech_knowlege_base'
class SoftSkills(models.Model):
    name = models.CharField(max_length=100, null=True, unique=True)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)
    sort_id = models.IntegerField(default=99)
    slug = models.SlugField()

    def clean(self):
        self.name = self.name.title()

    def save(self, *args, **kwargs):
        self.slug = slugify(self.name)
        super(SoftSkills, self).save(*args, **kwargs)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'soft_skills'
class SoftSkillsData(models.Model):
    soft_skill = models.ForeignKey(SoftSkills)
    title = models.CharField(max_length=200)
    link = models.CharField(max_length=200)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    diff_levels = (
        (0, 'Beginner'),
        (1, 'Intermediate'),
        (2, 'Advanced')
    )
    difficulty = models.IntegerField(choices=diff_levels, default=0)
    diff_sort = models.IntegerField(default=99)
    data_type = (
        (0, 'Video'),
        (1, 'Article'),
        (2, 'Interactive Site'),
        (3, 'Other')
    )
    media_type = models.IntegerField(choices=data_type, default=3)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'soft_knowlege_base'
class ProjectBase(models.Model):
    prog_lang = models.ManyToManyField(ProgLang)
    title = models.CharField(max_length=200)
    link = models.CharField(max_length=200)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    sort_id = models.IntegerField(default=99)
    diff_levels = (
        (0, 'Beginner'),
        (1, 'Intermediate'),
        (2, 'Advanced')
    )
    difficulty = models.IntegerField(choices=diff_levels, default=0)
    diff_sort = models.IntegerField(default=99)
    data_type = (
        (0, 'Video'),
        (1, 'Article'),
        (2, 'Interactive Site'),
        (3, 'Other')
    )
    media_type = models.IntegerField(choices=data_type, default=3)
    notes = models.TextField(null=True, blank=True)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'tech_project_base'
class UserFeedback(models.Model):
    username = models.CharField(max_length=200, default=None)
    email = models.EmailField()
    feedback_note = models.TextField()
    active = models.BooleanField(default=True)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'user_feedback'
class RandomStuff(models.Model):
    title = models.CharField(max_length=200)
    link = models.CharField(max_length=200)
    desc = models.TextField(null=True, blank=True)
    active = models.BooleanField(default=True)
    sort_id = models.IntegerField(default=99)
    data_type = (
        (0, 'Video'),
        (1, 'Article'),
        (2, 'Interactive Site'),
        (3, 'Other')
    )
    media_type = models.IntegerField(choices=data_type, default=3)
    created_at = models.DateTimeField(auto_now_add=True)
    modified_at = models.DateTimeField(auto_now=True)

    def __unicode__(self):
        return unicode_class(self)

    def to_dict(self):
        return convert_to_dict(self)

    class Meta:
        db_table = 'random_stuff'
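Every `save()` above derives `slug` from `name` via Django's `slugify`. A rough pure-Python approximation of that transformation, for intuition only (Django's real `slugify` also handles Unicode normalization and more edge cases):

```python
import re

def slugify_sketch(value):
    # Lowercase, drop characters that are not word chars / whitespace / hyphens,
    # then collapse runs of whitespace and hyphens into a single hyphen.
    value = value.lower().strip()
    value = re.sub(r"[^\w\s-]", "", value)
    return re.sub(r"[\s-]+", "-", value).strip("-")

print(slugify_sketch("Soft Skills 101"))  # soft-skills-101
```

This is why a `ProgLang` named "C++" and one named "C" could collide on `slug` even though `name` is unique; Django's `SlugField` does not enforce uniqueness unless asked.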
# Aula_02.py (EvertonSilva22/POO-Phyton @ 7093357, MIT)
# int: 3 -5 80
# float: 3.0 -1.2 5. .7
# bool: True False
# str: 'olá' "olá" '''olá''' """olá"""
# Operators: ** + - * / // %
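A few runnable lines exercising the types and operators listed above:

```python
# Basic types
a = 80        # int
b = -1.2      # float
c = True      # bool
d = 'olá'     # str

# Arithmetic operators
print(2 ** 3)   # 8   (exponentiation)
print(7 / 2)    # 3.5 (true division always yields a float)
print(7 // 2)   # 3   (floor division)
print(7 % 2)    # 1   (remainder)
```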
# recipes/recipes_emscripten/cryptography/test_import_cryptography.py (emscripten-forge/recipes @ 62cb3e1, MIT)
def test_import_cryptography():
import cryptography
import cryptography.fernet
import cryptography.hazmat
import cryptography.utils
import cryptography.x509
| 22.75 | 31 | 0.758242 | 18 | 182 | 7.555556 | 0.444444 | 0.794118 | 0.352941 | 0.529412 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02069 | 0.203297 | 182 | 8 | 32 | 22.75 | 0.917241 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | true | 0 | 1 | 0 | 1.166667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
c20d07f00a7894854546e7c50287244e30bd1e88 | 7,587 | py | Python | tests/test_mapper.py | H10K/ukmohso | b15e93c347d1744f443cf2a08a4275bd443ee800 | [
"BSD-3-Clause"
] | null | null | null | tests/test_mapper.py | H10K/ukmohso | b15e93c347d1744f443cf2a08a4275bd443ee800 | [
"BSD-3-Clause"
] | 1 | 2019-09-21T11:07:29.000Z | 2019-09-21T11:07:29.000Z | tests/test_mapper.py | H10K/ukmohso | b15e93c347d1744f443cf2a08a4275bd443ee800 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python3
import os
import unittest
from ukmohso import Mapper
class TestMapper(unittest.TestCase):
def test_no_data_available(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '1972 10 --- --- --- ---'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(len(yields), 0)
def test_provisional_data_can_be_ignored(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1 10 34.5 69.1 Provisional'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(len(yields), 0)
def test_skip_data_preamble(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
mapper = Mapper('Fulchester')
yields = mapper.yields()
self.assertEqual(len(yields), 0)
def test_skip_nonsensical_data(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "42"
mapper = Mapper('Fulchester')
yields = mapper.yields()
key = mapper.yields()[0]['key']
value = mapper.yields()[0]['value']
self.assertEqual(key, 'mapper_skipped')
comparison = '%s:%d' % (mapreduce_map_input_file, 42)
self.assertEqual(value, comparison)
def test_station_name(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = 'foobar'
mapper = Mapper(line)
station_name = mapper.station_name()
self.assertTrue(station_name == 'fulchester')
def test_tmax(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2 2.1 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[0]['key'], 'fulchester:tmax:01')
self.assertEqual(yields[0]['value'], 7.2)
def test_tmax_esitmated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[0]['key'], 'fulchester:tmax:01')
self.assertEqual(yields[0]['value'], 7.2)
def test_tmax_no_estimated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmax:01')
def test_tmax_not_available(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 --- 2.1 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmax:01')
def test_tmean(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2 2.1 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[2]['key'], 'fulchester:tmean:01')
self.assertEqual(yields[2]['value'], 4.65)
def test_tmean_esitmated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[2]['key'], 'fulchester:tmean:01')
self.assertEqual(yields[2]['value'], 4.65)
def test_tmean_no_estimated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmean:01')
def test_tmean_not_available(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 --- 2.1 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmean:01')
def test_tmin(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2 2.1 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[1]['key'], 'fulchester:tmin:01')
self.assertEqual(yields[1]['value'], 2.1)
def test_tmin_esitmated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1* 10 34.5 69.1'
mapper = Mapper(line, True, False)
yields = mapper.yields()
self.assertEqual(yields[1]['key'], 'fulchester:tmin:01')
self.assertEqual(yields[1]['value'], 2.1)
def test_tmin_no_estimated(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* 2.1* 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmin:01')
def test_tmin_not_available(self):
mapreduce_map_input_file = 's3://path/to/fulchesterdata.txt.gz'
os.environ['mapreduce_map_input_file'] = mapreduce_map_input_file
os.environ['map_input_start'] = "0"
line = '2019 1 7.2* --- 10 34.5 69.1'
mapper = Mapper(line, False, False)
yields = mapper.yields()
for y in yields:
self.assertFalse(y['key'] == 'fulchester:tmin:01')
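The `tmean` expectations above (4.65 for a tmax of 7.2 and a tmin of 2.1) imply that the mapper emits the midpoint of the monthly max and min; a standalone sketch of just that calculation (the real `Mapper` also parses whole lines, strips estimated-value `*` markers, and skips `---` gaps):

```python
def tmean(tmax, tmin):
    # Monthly mean temperature as the midpoint of max and min,
    # matching the 4.65 expected from 7.2 and 2.1 in the tests above
    return (tmax + tmin) / 2

print(tmean(7.2, 2.1))  # ~4.65
```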
# plugins/registration/api.py (bsavelev/medipy @ c20fe90, CECILL-B)
from registration import histogram_matching, registration_nl
from registration import registration_gui as registration
# tests/context.py (PixelAssassin/img-rename @ 5f19f19, MIT)
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import main
from imgrename import core as ir
from imgrename import EXIF, exifutil, fileutil, slogging
# a10_neutron_lbaas/tests/unit/v2/test_policy_converter.py (hthompson6/a10-neutron-lbaas @ f163975, Apache-2.0)
# Copyright 2019, Doug Wiegley (dougwig), A10 Networks
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
from a10_neutron_lbaas.v2.policy import PolicyUtil
from a10_neutron_lbaas.tests.unit.v2 import fake_objs
from a10_neutron_lbaas.tests.unit.v2 import test_base
class TestPolicy(test_base.HandlerTestBase):

    def test_create_with_no_rules_action_REDIRECT_TO_URL(self):
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_URL", None,
                                   "http://a10networks.com", 12)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ( true ) } {
                HTTP::redirect http://a10networks.com
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_no_rules_action_REDIRECT_TO_POOL(self):
        pool = fake_objs.FakePool('HTTP', 'ROUND_ROBIN', 'SOURCE_IP')
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_POOL", pool,
                                   None, 12)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ( true ) } {
                pool fake-pool-id-001
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_no_rules_action_REJECT(self):
        m = fake_objs.FakeL7Policy(None, "REJECT", None,
                                   None, 12)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ( true ) } {
                HTTP::close
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_with_rule_action_REDIRECT_TO_URL(self):
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_URL", None,
                                   "http://a10networks.com", 12)
        r = fake_objs.FakeL7Rule(m.id, 'HOST_NAME',
                                 'STARTS_WITH', None, 'testvalue')
        m.rules = []
        m.rules.append(r)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ([HTTP::host] starts_with "testvalue") } {
                HTTP::redirect http://a10networks.com
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_with_rule_action_REDIRECT_TO_POOL(self):
        pool = fake_objs.FakePool('HTTP', 'ROUND_ROBIN', 'SOURCE_IP')
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_POOL", pool,
                                   None, 12)
        r = fake_objs.FakeL7Rule(m.id, 'FILE_TYPE',
                                 'STARTS_WITH', 'testkey', 'testvalue')
        m.rules = []
        m.rules.append(r)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ([HTTP::uri] ends_with "testvalue") } {
                pool fake-pool-id-001
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_with_rule_action_REJECT(self):
        m = fake_objs.FakeL7Policy(None, "REJECT", None,
                                   None, 12)
        r = fake_objs.FakeL7Rule(m.id, 'COOKIE',
                                 'CONTAINS', 'testkey', 'testvalue')
        m.rules = []
        m.rules.append(r)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ([HTTP::cookie testkey] contains "testvalue") } {
                HTTP::close
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_with_multi_rule_action_REDIRECT_TO_URL(self):
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_URL", None,
                                   "http://a10networks.com", 12)
        r1 = fake_objs.FakeL7Rule(m.id, 'HOST_NAME',
                                  'STARTS_WITH', None, 'testvalue')
        r2 = fake_objs.FakeL7Rule(m.id, 'FILE_TYPE',
                                  'STARTS_WITH', 'testkey', 'testvalue')
        r3 = fake_objs.FakeL7Rule(m.id, 'COOKIE',
                                  'CONTAINS', 'testkey', 'testvalue')
        m.rules = [r1, r2, r3]
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { ([HTTP::host] starts_with "testvalue") and
                 ([HTTP::uri] ends_with "testvalue") and
                 ([HTTP::cookie testkey] contains "testvalue") } {
                HTTP::redirect http://a10networks.com
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)

    def test_create_with_with_rule_action_REDIRECT_TO_POOL_invert(self):
        pool = fake_objs.FakePool('HTTP', 'ROUND_ROBIN', 'SOURCE_IP')
        m = fake_objs.FakeL7Policy(None, "REDIRECT_TO_POOL", pool,
                                   None, 12)
        r = fake_objs.FakeL7Rule(m.id, 'FILE_TYPE',
                                 'STARTS_WITH', 'testkey', 'testvalue')
        r.invert = True
        m.rules = []
        m.rules.append(r)
        p = PolicyUtil()
        resp = p.createPolicy(m)
        expect_resp = """when HTTP_REQUEST {
            if { not([HTTP::uri] ends_with "testvalue") } {
                pool fake-pool-id-001
            }
        }
        """
        resp = resp.replace('\n', '').replace(' ', '')
        expect_resp = expect_resp.replace('\n', '').replace(' ', '')
        self.assertEqual(resp, expect_resp)
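The expected iRules above map each `(rule type, compare type)` pair to a fixed TCL condition, e.g. `HOST_NAME`/`STARTS_WITH` to `[HTTP::host] starts_with "..."`. A minimal table-driven sketch of that mapping, with the condition templates copied from the expected responses in these tests (`render_rule` is a hypothetical helper; the real `PolicyUtil` presumably covers more combinations):

```python
# (rule type, compare type) -> TCL condition template, taken verbatim from
# the expected iRule strings in the tests above. Note that FILE_TYPE with
# STARTS_WITH renders as an ends_with check on the URI in those tests.
CONDITIONS = {
    ("HOST_NAME", "STARTS_WITH"): '([HTTP::host] starts_with "{value}")',
    ("FILE_TYPE", "STARTS_WITH"): '([HTTP::uri] ends_with "{value}")',
    ("COOKIE", "CONTAINS"): '([HTTP::cookie {key}] contains "{value}")',
}

def render_rule(rule_type, compare_type, value, key=None, invert=False):
    cond = CONDITIONS[(rule_type, compare_type)].format(key=key, value=value)
    return "not" + cond if invert else cond

print(render_rule("HOST_NAME", "STARTS_WITH", "testvalue"))
# ([HTTP::host] starts_with "testvalue")
```

Joining several rendered conditions with ` and ` then reproduces the multi-rule `if { ... }` clause checked in `test_create_with_with_multi_rule_action_REDIRECT_TO_URL`.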
# pytorchocr/modeling/heads/rec_att_head.py (BHD233/PaddleOCR2Pytorch @ f114069, Apache-2.0)
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from pytorchocr.modeling.common import Activation
class AttentionHead(nn.Module):
    def __init__(self, in_channels, out_channels, hidden_size, **kwargs):
        super(AttentionHead, self).__init__()
        self.input_size = in_channels
        self.hidden_size = hidden_size
        self.num_classes = out_channels
        self.attention_cell = AttentionGRUCell(
            in_channels, hidden_size, out_channels, use_gru=False)
        self.generator = nn.Linear(hidden_size, out_channels)

    def _char_to_onehot(self, input_char, onehot_dim):
        input_one_hot = F.one_hot(input_char.type(torch.int64), onehot_dim)
        return input_one_hot

    def forward(self, inputs, targets=None, batch_max_length=25):
        batch_size = inputs.size()[0]
        num_steps = batch_max_length

        hidden = torch.zeros((batch_size, self.hidden_size))
        output_hiddens = []
        if targets is not None:
            # training: teacher-force the ground-truth character at each step
            for i in range(num_steps):
                char_onehots = self._char_to_onehot(
                    targets[:, i], onehot_dim=self.num_classes)
                (outputs, hidden), alpha = self.attention_cell(hidden, inputs,
                                                               char_onehots)
                output_hiddens.append(torch.unsqueeze(outputs, dim=1))
            output = torch.cat(output_hiddens, dim=1)
            probs = self.generator(output)
        else:
            # inference: feed back the argmax prediction of the previous step
            targets = torch.zeros([batch_size], dtype=torch.int32)
            probs = None
            char_onehots = None
            outputs = None
            alpha = None
            for i in range(num_steps):
                char_onehots = self._char_to_onehot(
                    targets, onehot_dim=self.num_classes)
                (outputs, hidden), alpha = self.attention_cell(hidden, inputs,
                                                               char_onehots)
                probs_step = self.generator(outputs)
                if probs is None:
                    probs = torch.unsqueeze(probs_step, dim=1)
                else:
                    probs = torch.cat(
                        [probs, torch.unsqueeze(probs_step, dim=1)], dim=1)
                next_input = probs_step.argmax(dim=1)
                targets = next_input
        return probs
class AttentionGRUCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_embeddings, use_gru=False):
        super(AttentionGRUCell, self).__init__()
        self.i2h = nn.Linear(input_size, hidden_size, bias=False)
        self.h2h = nn.Linear(hidden_size, hidden_size)
        self.score = nn.Linear(hidden_size, 1, bias=False)
        self.rnn = nn.GRUCell(
            input_size=input_size + num_embeddings, hidden_size=hidden_size, bias=True)
        self.hidden_size = hidden_size

    def forward(self, prev_hidden, batch_H, char_onehots):
        batch_H_proj = self.i2h(batch_H)
        prev_hidden_proj = torch.unsqueeze(self.h2h(prev_hidden), dim=1)
        res = torch.add(batch_H_proj, prev_hidden_proj)
        res = torch.tanh(res)
        e = self.score(res)
        alpha = F.softmax(e, dim=1)
        alpha = alpha.permute(0, 2, 1)
        context = torch.squeeze(torch.matmul(alpha, batch_H), dim=1)
        concat_context = torch.cat([context, char_onehots.float()], 1)
        cur_hidden = self.rnn(concat_context, prev_hidden)
        return (cur_hidden, cur_hidden), alpha
class AttentionLSTM(nn.Module):
    def __init__(self, in_channels, out_channels, hidden_size, **kwargs):
        super(AttentionLSTM, self).__init__()
        self.input_size = in_channels
        self.hidden_size = hidden_size
        self.num_classes = out_channels
        self.attention_cell = AttentionLSTMCell(
            in_channels, hidden_size, out_channels, use_gru=False)
        self.generator = nn.Linear(hidden_size, out_channels)

    def _char_to_onehot(self, input_char, onehot_dim):
        input_one_hot = F.one_hot(input_char.type(torch.int64), onehot_dim)
        return input_one_hot

    def forward(self, inputs, targets=None, batch_max_length=25):
        batch_size = inputs.shape[0]
        num_steps = batch_max_length

        hidden = (torch.zeros((batch_size, self.hidden_size)), torch.zeros(
            (batch_size, self.hidden_size)))
        output_hiddens = []
        if targets is not None:
            for i in range(num_steps):
                # one-hot vector for the i-th ground-truth char
                char_onehots = self._char_to_onehot(
                    targets[:, i], onehot_dim=self.num_classes)
                hidden, alpha = self.attention_cell(hidden, inputs,
                                                    char_onehots)
                hidden = (hidden[1][0], hidden[1][1])
                output_hiddens.append(torch.unsqueeze(hidden[0], dim=1))
            output = torch.cat(output_hiddens, dim=1)
            probs = self.generator(output)
        else:
            targets = torch.zeros([batch_size], dtype=torch.int32)
            probs = None
            for i in range(num_steps):
                char_onehots = self._char_to_onehot(
                    targets, onehot_dim=self.num_classes)
                hidden, alpha = self.attention_cell(hidden, inputs,
                                                    char_onehots)
                probs_step = self.generator(hidden[0])
                hidden = (hidden[1][0], hidden[1][1])
                if probs is None:
                    probs = torch.unsqueeze(probs_step, dim=1)
                else:
                    probs = torch.cat(
                        [probs, torch.unsqueeze(probs_step, dim=1)], dim=1)
                next_input = probs_step.argmax(dim=1)
                targets = next_input
        return probs


class AttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_embeddings, use_gru=False):
        super(AttentionLSTMCell, self).__init__()
        self.i2h = nn.Linear(input_size, hidden_size, bias=False)
        self.h2h = nn.Linear(hidden_size, hidden_size)
        self.score = nn.Linear(hidden_size, 1, bias=False)
        if not use_gru:
            self.rnn = nn.LSTMCell(
                input_size=input_size + num_embeddings, hidden_size=hidden_size)
        else:
            self.rnn = nn.GRUCell(
                input_size=input_size + num_embeddings, hidden_size=hidden_size)
        self.hidden_size = hidden_size

    def forward(self, prev_hidden, batch_H, char_onehots):
        batch_H_proj = self.i2h(batch_H)
        prev_hidden_proj = torch.unsqueeze(self.h2h(prev_hidden[0]), dim=1)
        res = torch.add(batch_H_proj, prev_hidden_proj)
        res = torch.tanh(res)
        e = self.score(res)
        alpha = F.softmax(e, dim=1)
        alpha = alpha.permute(0, 2, 1)
        context = torch.squeeze(torch.matmul(alpha, batch_H), dim=1)
        concat_context = torch.cat([context, char_onehots.float()], 1)
        cur_hidden = self.rnn(concat_context, prev_hidden)
        return cur_hidden, alpha
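
The additive-attention step inside these cells can be exercised in isolation. The sketch below mirrors the `i2h`/`h2h`/`score` projections above; the batch size, step count, and channel dimensions are illustrative assumptions, not values taken from this file:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
batch, steps, in_ch, hid = 2, 10, 64, 128

i2h = nn.Linear(in_ch, hid, bias=False)   # projects encoder features
h2h = nn.Linear(hid, hid)                 # projects the previous hidden state
score = nn.Linear(hid, 1, bias=False)     # scalar energy per time step

batch_H = torch.rand(batch, steps, in_ch)  # encoder output (B, T, C)
prev_h = torch.zeros(batch, hid)           # decoder hidden state (B, H)

e = score(torch.tanh(i2h(batch_H) + h2h(prev_h).unsqueeze(1)))  # (B, T, 1)
alpha = F.softmax(e, dim=1).permute(0, 2, 1)                    # (B, 1, T)
context = torch.matmul(alpha, batch_H).squeeze(1)               # (B, C)

print(alpha.sum(dim=2))   # each attention row sums to 1
print(context.shape)      # torch.Size([2, 64])
```

The context vector is then concatenated with the character one-hot and fed to the recurrent cell, exactly as in `AttentionLSTMCell.forward`.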


# --- src/encoded/tests/test_schema_donor.py (Lattice-Data/encoded, MIT) ---
import pytest


def test_age_humanpostnatal(testapp, human_postnatal_donor_base):
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '9'}, status=422)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age_units': 'year'}, status=422)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '>89', 'age_units': 'year'}, status=200)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '9', 'age_units': 'year'}, status=200)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '9-15', 'age_units': 'year'}, status=200)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '6-9', 'age_units': 'year'}, status=200)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '6-', 'age_units': 'year'}, status=422)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '-6', 'age_units': 'year'}, status=422)
    testapp.patch_json(human_postnatal_donor_base['@id'], {'age': '157', 'age_units': 'month'}, status=200)


# --- python/testData/highlighting/starArgs.py (JetBrains intellij-community, Apache-2.0) ---
def f1(*args, <error descr="Multiple * arguments are not allowed">*</error>, a):
    pass


def f2(*, <error descr="Multiple * arguments are not allowed">*</error>, d):
    pass


def f3(*, <error descr="Multiple * arguments are not allowed">*args</error>):
    pass


# --- python/__init__.py (SupinePandora43/Rails-3D, MIT) ---
import python.specular
import python.models


# --- tests/test_castle_rules.py (codyd51/castle, MIT) ---
import unittest
import castle
from castle import Game, Board, Piece, PieceType, Color, InvalidChessNotationError, PlayerType, MoveParser, FenGameConstructor


class CastleTests(unittest.TestCase):
    def test_can_short_castle(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. e4 e5 2. Bd3 d5 3. Nf3 f5 4. a4 g5')
        self.assertTrue(g.can_castle(Color.WHITE, True))
        self.assertFalse(g.can_castle(Color.WHITE, False))
        self.assertFalse(g.can_castle(Color.BLACK, True))
        self.assertFalse(g.can_castle(Color.BLACK, False))

    def test_can_long_castle(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. d4 d5 2. e4 e5 3. Bf4 Qd7 4. Qg4 c6 5. Nc3')
        self.assertTrue(g.can_castle(Color.WHITE, False))
        self.assertFalse(g.can_castle(Color.WHITE, True))
        self.assertFalse(g.can_castle(Color.BLACK, True))
        self.assertFalse(g.can_castle(Color.BLACK, False))

    def test_castle_after_king_movement(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. e4 e5 2. Bd3 d5 3. Nf3 f5 4. Kf1 g5 5. Ke1 c5')
        # white has moved its king, so it may no longer castle
        self.assertFalse(g.can_castle(Color.WHITE, True))
        # test all the others just to be safe
        self.assertFalse(g.can_castle(Color.WHITE, False))
        self.assertFalse(g.can_castle(Color.BLACK, True))
        self.assertFalse(g.can_castle(Color.BLACK, False))

    def test_castle_after_rook_movement(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. e4 e5 2. Bd3 d5 3. Nf3 f5 4. Rf1 g5 5. Rh1 c5')
        # white has moved its kingside rook
        self.assertFalse(g.can_castle(Color.WHITE, True))
        # test all the others just to be safe
        self.assertFalse(g.can_castle(Color.WHITE, False))
        self.assertFalse(g.can_castle(Color.BLACK, True))
        self.assertFalse(g.can_castle(Color.BLACK, False))

    def test_castle_obstructed(self):
        # in the starting position every castling path is obstructed
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        self.assertFalse(g.can_castle(Color.WHITE, True))
        self.assertFalse(g.can_castle(Color.WHITE, False))
        self.assertFalse(g.can_castle(Color.BLACK, True))
        self.assertFalse(g.can_castle(Color.BLACK, False))

    def test_can_castle_in_check(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. e4 e5 2. d4 f5 3. Bd3 d5 4. Nf3 c5 5. c4 ')
        # can castle now
        self.assertTrue(g.can_castle(Color.WHITE, True))
        # put white in check
        g.apply_notation('Qa5')
        # can't castle out of check
        self.assertFalse(g.can_castle(Color.WHITE, True))

    def test_can_castle_through_check(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. e4 b6 2. Nf3 e5 3. g3 d5 4. Bg2 ')
        # can castle now
        self.assertTrue(g.can_castle(Color.WHITE, True))
        # attack f1, which the king would pass through
        g.apply_notation('Ba6')
        # can't castle through check
        self.assertFalse(g.can_castle(Color.WHITE, True))

    def test_can_castle_into_check(self):
        g = Game(PlayerType.HUMAN, PlayerType.HUMAN)
        g.apply_record('1. f4 e5 2. Nh3 g5 3. g4 f5 4. Bg2 Bc5')
        # can castle now
        self.assertTrue(g.can_castle(Color.WHITE, True))
        # attack g1, the square the king would land on
        g.apply_notation('Bc5')
        # can't castle into check
        self.assertFalse(g.can_castle(Color.WHITE, True))


# --- idw_basic.py (allixender/py_interpol_demo, MIT) ---
# -*- coding: utf-8 -*-
import numpy as np


# DISTANCE FUNCTION
def distance(x1, y1, x2, y2):
    d = np.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)
    return d


# IDW with a fixed search radius r and inverse power p
def run_rblock(x, y, z, xz, yz, r, p):
    # collect the sample points that fall inside the square block
    # of half-width r centred on the target point (xz, yz)
    x_block = []
    y_block = []
    z_block = []
    xr_min = xz - r
    xr_max = xz + r
    yr_min = yz - r
    yr_max = yz + r
    for i in range(len(x)):
        # condition to test if a point is within the block
        if (x[i] >= xr_min and x[i] <= xr_max) and (y[i] >= yr_min and y[i] <= yr_max):
            x_block.append(x[i])
            y_block.append(y[i])
            z_block.append(z[i])

    # calculate weight based on distance and p value
    w_list = []
    for j in range(len(x_block)):
        d = distance(xz, yz, x_block[j], y_block[j])  # distance function defined above
        if d > 0:
            w_list.append(1 / (d ** p))
        else:
            w_list.append(0)  # d <= 0: the target coincides with a sample point

    # a zero weight marks an exact hit on a sample point
    w_check = 0 in w_list
    if w_check:
        idx = w_list.index(0)  # find index for weight=0
        z_idw = z_block[idx]  # set the value to the current sample value
    else:
        wt = np.transpose(w_list)
        z_idw = np.dot(z_block, wt) / sum(w_list)  # idw calculation using dot product
    return z_idw


def idw_rblock(x, y, z, grid_side_length, search_radius, p):
    # e.g. n=100
    n = grid_side_length
    # setup frame of reference:
    # left, right, lower, upper coordinate boundaries
    x_min = min(x)
    x_max = max(x)
    y_min = min(y)
    y_max = max(y)
    # width
    w = x_max - x_min
    # length
    h = y_max - y_min
    # x interval
    wn = w / n
    # y interval
    hn = h / n

    # target data lists to store interpolated points and values
    x_idw_list = []
    y_idw_list = []
    z_head = []
    # initialisation
    y_init = y_min
    x_init = x_min
    for i in range(n):
        xz = x_init + wn * i
        yz = y_init + hn * i
        y_idw_list.append(yz)
        x_idw_list.append(xz)
        z_idw_list = []
        for j in range(n):
            xz = x_init + wn * j
            # e.g. search_radius=100, inverse power value p=1.5
            z_idw = run_rblock(x, y, z, xz, yz, search_radius, p)
            z_idw_list.append(z_idw)
        z_head.append(z_idw_list)
    return (x_idw_list, y_idw_list, z_head)


# IDW that grows the search block until it holds at least n_point samples
def run_npoint(x, y, z, xz, yz, n_point, p, rblock_iter_distance=10):
    # block radius iteration distance
    r = rblock_iter_distance
    nf = 0
    while nf <= n_point:  # stops once the block holds at least n_point samples
        x_block = []
        y_block = []
        z_block = []
        r += 10  # widen the block by 10 units each iteration
        xr_min = xz - r
        xr_max = xz + r
        yr_min = yz - r
        yr_max = yz + r
        for i in range(len(x)):
            # condition to test if a point is within the block
            if (x[i] >= xr_min and x[i] <= xr_max) and (y[i] >= yr_min and y[i] <= yr_max):
                x_block.append(x[i])
                y_block.append(y[i])
                z_block.append(z[i])
        nf = len(x_block)  # number of points found in the block

    # calculate weight based on distance and p value
    w_list = []
    for j in range(len(x_block)):
        d = distance(xz, yz, x_block[j], y_block[j])
        if d > 0:
            w_list.append(1 / (d ** p))
        else:
            w_list.append(0)  # d <= 0: the target coincides with a sample point

    # a zero weight marks an exact hit on a sample point
    w_check = 0 in w_list
    if w_check:
        idx = w_list.index(0)  # find index for weight=0
        z_idw = z_block[idx]  # set the value to the current sample value
    else:
        wt = np.transpose(w_list)
        z_idw = np.dot(z_block, wt) / sum(w_list)  # idw calculation using dot product
    return z_idw


# e.g. min. number of search points=5, inverse power value p=1.5
def idw_npoint(x, y, z, grid_side_length, n_points, p, rblock_iter_distance=10):
    # e.g. n=100
    n = grid_side_length
    # setup frame of reference:
    # left, right, lower, upper coordinate boundaries
    x_min = min(x)
    x_max = max(x)
    y_min = min(y)
    y_max = max(y)
    # width
    w = x_max - x_min
    # length
    h = y_max - y_min
    # x interval
    wn = w / n
    # y interval
    hn = h / n

    # target data lists to store interpolated points and values
    x_idw_list = []
    y_idw_list = []
    z_head = []
    # initialisation
    y_init = y_min
    x_init = x_min
    for i in range(n):
        xz = x_init + wn * i
        yz = y_init + hn * i
        y_idw_list.append(yz)
        x_idw_list.append(xz)
        z_idw_list = []
        for j in range(n):
            xz = x_init + wn * j
            # pass rblock_iter_distance through (it was previously ignored)
            z_idw = run_npoint(x, y, z, xz, yz, n_points, p, rblock_iter_distance)
            z_idw_list.append(z_idw)
        z_head.append(z_idw_list)
    return (x_idw_list, y_idw_list, z_head)
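
The core weighting that `run_rblock`/`run_npoint` perform for a single grid node can be sketched in a self-contained, vectorized form. The sample coordinates and values below are made up for illustration:

```python
import numpy as np

def idw_point(x, y, z, xz, yz, p=1.5):
    """Inverse-distance-weighted estimate at (xz, yz) from samples (x, y, z)."""
    d = np.sqrt((x - xz) ** 2 + (y - yz) ** 2)
    if np.any(d == 0):                 # target coincides with a sample point
        return z[np.argmin(d)]
    w = 1.0 / d ** p                   # weight falls off with distance^p
    return np.dot(z, w) / w.sum()      # weighted average of sample values

x = np.array([0.0, 10.0, 0.0, 10.0])
y = np.array([0.0, 0.0, 10.0, 10.0])
z = np.array([1.0, 2.0, 3.0, 4.0])

print(idw_point(x, y, z, 5.0, 5.0))    # equidistant from all four samples, ~2.5
print(idw_point(x, y, z, 0.0, 0.0))    # exact hit on the first sample -> 1.0
```

A larger p concentrates the estimate on the nearest samples; p close to 0 approaches a plain mean.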


# --- pwa/models.py (slybixi/teller-pwa, MIT) ---
from django.db import models


class Airdrop(models.Model):
    title = models.CharField(max_length=250, default="", blank=True)
    sub_desc1 = models.CharField(max_length=250, default="", blank=True)
    terms_desc = models.TextField(default="", blank=True)


class Meta(models.Model):
    title = models.CharField(max_length=100, default="", blank=True)
    url = models.CharField(max_length=100, default="", blank=True)
    image = models.FilePathField(path="/img")
    image_alt = models.CharField(max_length=100, default="", blank=True)
    desc = models.TextField(default="", blank=True)
    fb_id = models.CharField(max_length=50, default="", blank=True)
    twitter_id = models.CharField(max_length=50, default="", blank=True)
    twitter_image = models.FilePathField(path="/img")
    twitter_card = models.CharField(max_length=50, default="", blank=True)
    twitter_link = models.CharField(max_length=50, default="", blank=True)


class Index(models.Model):
    title = models.CharField(max_length=250, default="", blank=True)
    sub_desc1 = models.CharField(max_length=250, default="", blank=True)
    sub_desc2 = models.CharField(max_length=250, default="", blank=True)
    desc = models.TextField(default="", blank=True)
    why_desc = models.TextField(default="", blank=True)


class Airdrop_Meta(models.Model):
    title = models.CharField(max_length=100, default="", blank=True)
    url = models.CharField(max_length=100, default="", blank=True)
    image = models.FilePathField(path="/img")
    image_alt = models.CharField(max_length=100, default="", blank=True)
    desc = models.TextField(default="", blank=True)
    fb_id = models.CharField(max_length=50, default="", blank=True)
    twitter_id = models.CharField(max_length=50, default="", blank=True)
    twitter_image = models.FilePathField(path="/img")
    twitter_card = models.CharField(max_length=50, default="", blank=True)
    twitter_link = models.CharField(max_length=50, default="", blank=True)


# --- pytest_mock/__init__.py (MIT) ---
from pytest_mock.plugin import *
from pytest_mock.plugin import _get_mock_module


# --- sdk/python/pulumi_alicloud/kms/key.py (pulumi/pulumi-alicloud, Apache-2.0) ---
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['KeyArgs', 'Key']
@pulumi.input_type
class KeyArgs:
def __init__(__self__, *,
automatic_rotation: Optional[pulumi.Input[str]] = None,
deletion_window_in_days: Optional[pulumi.Input[int]] = None,
description: Optional[pulumi.Input[str]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
key_spec: Optional[pulumi.Input[str]] = None,
key_state: Optional[pulumi.Input[str]] = None,
key_usage: Optional[pulumi.Input[str]] = None,
origin: Optional[pulumi.Input[str]] = None,
pending_window_in_days: Optional[pulumi.Input[int]] = None,
protection_level: Optional[pulumi.Input[str]] = None,
rotation_interval: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a Key resource.
:param pulumi.Input[str] automatic_rotation: Specifies whether to enable automatic key rotation. Default:"Disabled".
:param pulumi.Input[int] deletion_window_in_days: Field `deletion_window_in_days` has been deprecated from provider version 1.85.0. New field `pending_window_in_days` instead.
:param pulumi.Input[str] description: The description of the key as viewed in Alicloud console.
:param pulumi.Input[bool] is_enabled: Field `is_enabled` has been deprecated from provider version 1.85.0. New field `key_state` instead.
:param pulumi.Input[str] key_spec: The type of the CMK.
:param pulumi.Input[str] key_state: Field `key_state` has been deprecated from provider version 1.123.1. New field `status` instead.
:param pulumi.Input[str] key_usage: Specifies the usage of CMK. Currently, default to `ENCRYPT/DECRYPT`, indicating that CMK is used for encryption and decryption.
:param pulumi.Input[str] origin: The source of the key material for the CMK. Defaults to "Aliyun_KMS".
:param pulumi.Input[int] pending_window_in_days: Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
:param pulumi.Input[str] protection_level: The protection level of the CMK. Defaults to "SOFTWARE".
:param pulumi.Input[str] rotation_interval: The period of automatic key rotation. Unit: seconds.
:param pulumi.Input[str] status: The status of CMK. Defaults to Enabled. Valid Values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
if automatic_rotation is not None:
pulumi.set(__self__, "automatic_rotation", automatic_rotation)
if deletion_window_in_days is not None:
warnings.warn("""Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""", DeprecationWarning)
pulumi.log.warn("""deletion_window_in_days is deprecated: Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""")
if deletion_window_in_days is not None:
pulumi.set(__self__, "deletion_window_in_days", deletion_window_in_days)
if description is not None:
pulumi.set(__self__, "description", description)
if is_enabled is not None:
warnings.warn("""Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""")
if is_enabled is not None:
pulumi.set(__self__, "is_enabled", is_enabled)
if key_spec is not None:
pulumi.set(__self__, "key_spec", key_spec)
if key_state is not None:
warnings.warn("""Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""", DeprecationWarning)
pulumi.log.warn("""key_state is deprecated: Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""")
if key_state is not None:
pulumi.set(__self__, "key_state", key_state)
if key_usage is not None:
pulumi.set(__self__, "key_usage", key_usage)
if origin is not None:
pulumi.set(__self__, "origin", origin)
if pending_window_in_days is not None:
pulumi.set(__self__, "pending_window_in_days", pending_window_in_days)
if protection_level is not None:
pulumi.set(__self__, "protection_level", protection_level)
if rotation_interval is not None:
pulumi.set(__self__, "rotation_interval", rotation_interval)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter(name="automaticRotation")
def automatic_rotation(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to enable automatic key rotation. Default:"Disabled".
"""
return pulumi.get(self, "automatic_rotation")
@automatic_rotation.setter
def automatic_rotation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "automatic_rotation", value)
@property
@pulumi.getter(name="deletionWindowInDays")
def deletion_window_in_days(self) -> Optional[pulumi.Input[int]]:
"""
Field `deletion_window_in_days` has been deprecated from provider version 1.85.0. New field `pending_window_in_days` instead.
"""
return pulumi.get(self, "deletion_window_in_days")
@deletion_window_in_days.setter
def deletion_window_in_days(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "deletion_window_in_days", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the key as viewed in Alicloud console.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Field `is_enabled` has been deprecated from provider version 1.85.0. New field `key_state` instead.
"""
return pulumi.get(self, "is_enabled")
@is_enabled.setter
def is_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_enabled", value)
@property
@pulumi.getter(name="keySpec")
def key_spec(self) -> Optional[pulumi.Input[str]]:
"""
The type of the CMK.
"""
return pulumi.get(self, "key_spec")
@key_spec.setter
def key_spec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_spec", value)
@property
@pulumi.getter(name="keyState")
def key_state(self) -> Optional[pulumi.Input[str]]:
"""
Field `key_state` has been deprecated from provider version 1.123.1. New field `status` instead.
"""
return pulumi.get(self, "key_state")
@key_state.setter
def key_state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_state", value)
@property
@pulumi.getter(name="keyUsage")
def key_usage(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the usage of CMK. Currently, default to `ENCRYPT/DECRYPT`, indicating that CMK is used for encryption and decryption.
"""
return pulumi.get(self, "key_usage")
@key_usage.setter
def key_usage(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_usage", value)
@property
@pulumi.getter
def origin(self) -> Optional[pulumi.Input[str]]:
"""
The source of the key material for the CMK. Defaults to "Aliyun_KMS".
"""
return pulumi.get(self, "origin")
@origin.setter
def origin(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "origin", value)
@property
@pulumi.getter(name="pendingWindowInDays")
def pending_window_in_days(self) -> Optional[pulumi.Input[int]]:
"""
Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
"""
return pulumi.get(self, "pending_window_in_days")
@pending_window_in_days.setter
def pending_window_in_days(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "pending_window_in_days", value)
@property
@pulumi.getter(name="protectionLevel")
def protection_level(self) -> Optional[pulumi.Input[str]]:
"""
The protection level of the CMK. Defaults to "SOFTWARE".
"""
return pulumi.get(self, "protection_level")
@protection_level.setter
def protection_level(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "protection_level", value)
@property
@pulumi.getter(name="rotationInterval")
def rotation_interval(self) -> Optional[pulumi.Input[str]]:
"""
The period of automatic key rotation. Unit: seconds.
"""
return pulumi.get(self, "rotation_interval")
@rotation_interval.setter
def rotation_interval(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "rotation_interval", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The status of CMK. Defaults to Enabled. Valid Values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
@pulumi.input_type
class _KeyState:
def __init__(__self__, *,
arn: Optional[pulumi.Input[str]] = None,
automatic_rotation: Optional[pulumi.Input[str]] = None,
creation_date: Optional[pulumi.Input[str]] = None,
creator: Optional[pulumi.Input[str]] = None,
delete_date: Optional[pulumi.Input[str]] = None,
deletion_window_in_days: Optional[pulumi.Input[int]] = None,
description: Optional[pulumi.Input[str]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
key_spec: Optional[pulumi.Input[str]] = None,
key_state: Optional[pulumi.Input[str]] = None,
key_usage: Optional[pulumi.Input[str]] = None,
last_rotation_date: Optional[pulumi.Input[str]] = None,
material_expire_time: Optional[pulumi.Input[str]] = None,
next_rotation_date: Optional[pulumi.Input[str]] = None,
origin: Optional[pulumi.Input[str]] = None,
pending_window_in_days: Optional[pulumi.Input[int]] = None,
primary_key_version: Optional[pulumi.Input[str]] = None,
protection_level: Optional[pulumi.Input[str]] = None,
rotation_interval: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering Key resources.
:param pulumi.Input[str] arn: The Alicloud Resource Name (ARN) of the key.
* `creation_date` -The date and time when the CMK was created. The time is displayed in UTC.
* `creator` -The creator of the CMK.
* `delete_date` -The scheduled date to delete CMK. The time is displayed in UTC. This value is returned only when the KeyState value is PendingDeletion.
:param pulumi.Input[str] automatic_rotation: Specifies whether to enable automatic key rotation. Default:"Disabled".
:param pulumi.Input[int] deletion_window_in_days: Field `deletion_window_in_days` has been deprecated from provider version 1.85.0. New field `pending_window_in_days` instead.
:param pulumi.Input[str] description: The description of the key as viewed in Alicloud console.
:param pulumi.Input[bool] is_enabled: Field `is_enabled` has been deprecated from provider version 1.85.0. New field `key_state` instead.
:param pulumi.Input[str] key_spec: The type of the CMK.
:param pulumi.Input[str] key_state: Field `key_state` has been deprecated from provider version 1.123.1. New field `status` instead.
:param pulumi.Input[str] key_usage: Specifies the usage of CMK. Currently, default to `ENCRYPT/DECRYPT`, indicating that CMK is used for encryption and decryption.
:param pulumi.Input[str] last_rotation_date: The date and time the last rotation was performed. The time is displayed in UTC.
:param pulumi.Input[str] material_expire_time: The time and date the key material for the CMK expires. The time is displayed in UTC. If the value is empty, the key material for the CMK does not expire.
:param pulumi.Input[str] next_rotation_date: The time the next rotation is scheduled for execution.
:param pulumi.Input[str] origin: The source of the key material for the CMK. Defaults to "Aliyun_KMS".
:param pulumi.Input[int] pending_window_in_days: Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
:param pulumi.Input[str] primary_key_version: The ID of the current primary key version of the symmetric CMK.
:param pulumi.Input[str] protection_level: The protection level of the CMK. Defaults to "SOFTWARE".
:param pulumi.Input[str] rotation_interval: The period of automatic key rotation. Unit: seconds.
:param pulumi.Input[str] status: The status of the CMK. Defaults to `Enabled`. Valid values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
if arn is not None:
pulumi.set(__self__, "arn", arn)
if automatic_rotation is not None:
pulumi.set(__self__, "automatic_rotation", automatic_rotation)
if creation_date is not None:
pulumi.set(__self__, "creation_date", creation_date)
if creator is not None:
pulumi.set(__self__, "creator", creator)
if delete_date is not None:
pulumi.set(__self__, "delete_date", delete_date)
if deletion_window_in_days is not None:
warnings.warn("""Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""", DeprecationWarning)
pulumi.log.warn("""deletion_window_in_days is deprecated: Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""")
if deletion_window_in_days is not None:
pulumi.set(__self__, "deletion_window_in_days", deletion_window_in_days)
if description is not None:
pulumi.set(__self__, "description", description)
if is_enabled is not None:
warnings.warn("""Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""")
if is_enabled is not None:
pulumi.set(__self__, "is_enabled", is_enabled)
if key_spec is not None:
pulumi.set(__self__, "key_spec", key_spec)
if key_state is not None:
warnings.warn("""Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""", DeprecationWarning)
pulumi.log.warn("""key_state is deprecated: Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""")
if key_state is not None:
pulumi.set(__self__, "key_state", key_state)
if key_usage is not None:
pulumi.set(__self__, "key_usage", key_usage)
if last_rotation_date is not None:
pulumi.set(__self__, "last_rotation_date", last_rotation_date)
if material_expire_time is not None:
pulumi.set(__self__, "material_expire_time", material_expire_time)
if next_rotation_date is not None:
pulumi.set(__self__, "next_rotation_date", next_rotation_date)
if origin is not None:
pulumi.set(__self__, "origin", origin)
if pending_window_in_days is not None:
pulumi.set(__self__, "pending_window_in_days", pending_window_in_days)
if primary_key_version is not None:
pulumi.set(__self__, "primary_key_version", primary_key_version)
if protection_level is not None:
pulumi.set(__self__, "protection_level", protection_level)
if rotation_interval is not None:
pulumi.set(__self__, "rotation_interval", rotation_interval)
if status is not None:
pulumi.set(__self__, "status", status)
@property
@pulumi.getter
def arn(self) -> Optional[pulumi.Input[str]]:
"""
The Alicloud Resource Name (ARN) of the key.
* `creation_date` - The date and time when the CMK was created. The time is displayed in UTC.
* `creator` - The creator of the CMK.
* `delete_date` - The scheduled date to delete the CMK. The time is displayed in UTC. This value is returned only when the KeyState value is PendingDeletion.
"""
return pulumi.get(self, "arn")
@arn.setter
def arn(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "arn", value)
@property
@pulumi.getter(name="automaticRotation")
def automatic_rotation(self) -> Optional[pulumi.Input[str]]:
"""
Specifies whether to enable automatic key rotation. Default: "Disabled".
"""
return pulumi.get(self, "automatic_rotation")
@automatic_rotation.setter
def automatic_rotation(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "automatic_rotation", value)
@property
@pulumi.getter(name="creationDate")
def creation_date(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "creation_date")
@creation_date.setter
def creation_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "creation_date", value)
@property
@pulumi.getter
def creator(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "creator")
@creator.setter
def creator(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "creator", value)
@property
@pulumi.getter(name="deleteDate")
def delete_date(self) -> Optional[pulumi.Input[str]]:
return pulumi.get(self, "delete_date")
@delete_date.setter
def delete_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "delete_date", value)
@property
@pulumi.getter(name="deletionWindowInDays")
def deletion_window_in_days(self) -> Optional[pulumi.Input[int]]:
"""
Field `deletion_window_in_days` has been deprecated since provider version 1.85.0. Use the new field `pending_window_in_days` instead.
"""
return pulumi.get(self, "deletion_window_in_days")
@deletion_window_in_days.setter
def deletion_window_in_days(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "deletion_window_in_days", value)
@property
@pulumi.getter
def description(self) -> Optional[pulumi.Input[str]]:
"""
The description of the key as viewed in Alicloud console.
"""
return pulumi.get(self, "description")
@description.setter
def description(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "description", value)
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> Optional[pulumi.Input[bool]]:
"""
Field `is_enabled` has been deprecated since provider version 1.85.0. Use the new field `key_state` instead.
"""
return pulumi.get(self, "is_enabled")
@is_enabled.setter
def is_enabled(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "is_enabled", value)
@property
@pulumi.getter(name="keySpec")
def key_spec(self) -> Optional[pulumi.Input[str]]:
"""
The type of the CMK.
"""
return pulumi.get(self, "key_spec")
@key_spec.setter
def key_spec(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_spec", value)
@property
@pulumi.getter(name="keyState")
def key_state(self) -> Optional[pulumi.Input[str]]:
"""
Field `key_state` has been deprecated since provider version 1.123.1. Use the new field `status` instead.
"""
return pulumi.get(self, "key_state")
@key_state.setter
def key_state(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_state", value)
@property
@pulumi.getter(name="keyUsage")
def key_usage(self) -> Optional[pulumi.Input[str]]:
"""
Specifies the usage of the CMK. Currently defaults to `ENCRYPT/DECRYPT`, indicating that the CMK is used for encryption and decryption.
"""
return pulumi.get(self, "key_usage")
@key_usage.setter
def key_usage(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "key_usage", value)
@property
@pulumi.getter(name="lastRotationDate")
def last_rotation_date(self) -> Optional[pulumi.Input[str]]:
"""
The date and time the last rotation was performed. The time is displayed in UTC.
"""
return pulumi.get(self, "last_rotation_date")
@last_rotation_date.setter
def last_rotation_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "last_rotation_date", value)
@property
@pulumi.getter(name="materialExpireTime")
def material_expire_time(self) -> Optional[pulumi.Input[str]]:
"""
The time and date the key material for the CMK expires. The time is displayed in UTC. If the value is empty, the key material for the CMK does not expire.
"""
return pulumi.get(self, "material_expire_time")
@material_expire_time.setter
def material_expire_time(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "material_expire_time", value)
@property
@pulumi.getter(name="nextRotationDate")
def next_rotation_date(self) -> Optional[pulumi.Input[str]]:
"""
The time the next rotation is scheduled for execution.
"""
return pulumi.get(self, "next_rotation_date")
@next_rotation_date.setter
def next_rotation_date(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "next_rotation_date", value)
@property
@pulumi.getter
def origin(self) -> Optional[pulumi.Input[str]]:
"""
The source of the key material for the CMK. Defaults to "Aliyun_KMS".
"""
return pulumi.get(self, "origin")
@origin.setter
def origin(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "origin", value)
@property
@pulumi.getter(name="pendingWindowInDays")
def pending_window_in_days(self) -> Optional[pulumi.Input[int]]:
"""
Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
"""
return pulumi.get(self, "pending_window_in_days")
@pending_window_in_days.setter
def pending_window_in_days(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "pending_window_in_days", value)
@property
@pulumi.getter(name="primaryKeyVersion")
def primary_key_version(self) -> Optional[pulumi.Input[str]]:
"""
The ID of the current primary key version of the symmetric CMK.
"""
return pulumi.get(self, "primary_key_version")
@primary_key_version.setter
def primary_key_version(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "primary_key_version", value)
@property
@pulumi.getter(name="protectionLevel")
def protection_level(self) -> Optional[pulumi.Input[str]]:
"""
The protection level of the CMK. Defaults to "SOFTWARE".
"""
return pulumi.get(self, "protection_level")
@protection_level.setter
def protection_level(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "protection_level", value)
@property
@pulumi.getter(name="rotationInterval")
def rotation_interval(self) -> Optional[pulumi.Input[str]]:
"""
The period of automatic key rotation. Unit: seconds.
"""
return pulumi.get(self, "rotation_interval")
@rotation_interval.setter
def rotation_interval(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "rotation_interval", value)
@property
@pulumi.getter
def status(self) -> Optional[pulumi.Input[str]]:
"""
The status of the CMK. Defaults to `Enabled`. Valid values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
return pulumi.get(self, "status")
@status.setter
def status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "status", value)
class Key(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
automatic_rotation: Optional[pulumi.Input[str]] = None,
deletion_window_in_days: Optional[pulumi.Input[int]] = None,
description: Optional[pulumi.Input[str]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
key_spec: Optional[pulumi.Input[str]] = None,
key_state: Optional[pulumi.Input[str]] = None,
key_usage: Optional[pulumi.Input[str]] = None,
origin: Optional[pulumi.Input[str]] = None,
pending_window_in_days: Optional[pulumi.Input[int]] = None,
protection_level: Optional[pulumi.Input[str]] = None,
rotation_interval: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
A KMS key helps users protect the security of data in transit. For information about Alikms Key and how to use it, see [What is Resource Alikms Key](https://www.alibabacloud.com/help/doc-detail/28947.htm).
> **NOTE:** Available in v1.85.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
key = alicloud.kms.Key("key",
description="Hello KMS",
pending_window_in_days=7,
status="Enabled")
```
## Import
Alikms key can be imported using the id, e.g.
```sh
$ pulumi import alicloud:kms/key:Key example abc123456
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] automatic_rotation: Specifies whether to enable automatic key rotation. Default: "Disabled".
:param pulumi.Input[int] deletion_window_in_days: Field `deletion_window_in_days` has been deprecated since provider version 1.85.0. Use the new field `pending_window_in_days` instead.
:param pulumi.Input[str] description: The description of the key as viewed in Alicloud console.
:param pulumi.Input[bool] is_enabled: Field `is_enabled` has been deprecated since provider version 1.85.0. Use the new field `key_state` instead.
:param pulumi.Input[str] key_spec: The type of the CMK.
:param pulumi.Input[str] key_state: Field `key_state` has been deprecated since provider version 1.123.1. Use the new field `status` instead.
:param pulumi.Input[str] key_usage: Specifies the usage of the CMK. Currently defaults to `ENCRYPT/DECRYPT`, indicating that the CMK is used for encryption and decryption.
:param pulumi.Input[str] origin: The source of the key material for the CMK. Defaults to "Aliyun_KMS".
:param pulumi.Input[int] pending_window_in_days: Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
:param pulumi.Input[str] protection_level: The protection level of the CMK. Defaults to "SOFTWARE".
:param pulumi.Input[str] rotation_interval: The period of automatic key rotation. Unit: seconds.
:param pulumi.Input[str] status: The status of the CMK. Defaults to `Enabled`. Valid values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: Optional[KeyArgs] = None,
opts: Optional[pulumi.ResourceOptions] = None):
"""
A KMS key helps users protect the security of data in transit. For information about Alikms Key and how to use it, see [What is Resource Alikms Key](https://www.alibabacloud.com/help/doc-detail/28947.htm).
> **NOTE:** Available in v1.85.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
key = alicloud.kms.Key("key",
description="Hello KMS",
pending_window_in_days=7,
status="Enabled")
```
## Import
Alikms key can be imported using the id, e.g.
```sh
$ pulumi import alicloud:kms/key:Key example abc123456
```
:param str resource_name: The name of the resource.
:param KeyArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(KeyArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
automatic_rotation: Optional[pulumi.Input[str]] = None,
deletion_window_in_days: Optional[pulumi.Input[int]] = None,
description: Optional[pulumi.Input[str]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
key_spec: Optional[pulumi.Input[str]] = None,
key_state: Optional[pulumi.Input[str]] = None,
key_usage: Optional[pulumi.Input[str]] = None,
origin: Optional[pulumi.Input[str]] = None,
pending_window_in_days: Optional[pulumi.Input[int]] = None,
protection_level: Optional[pulumi.Input[str]] = None,
rotation_interval: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = KeyArgs.__new__(KeyArgs)
__props__.__dict__["automatic_rotation"] = automatic_rotation
if deletion_window_in_days is not None and not opts.urn:
warnings.warn("""Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""", DeprecationWarning)
pulumi.log.warn("""deletion_window_in_days is deprecated: Field 'deletion_window_in_days' has been deprecated from provider version 1.85.0. New field 'pending_window_in_days' instead.""")
__props__.__dict__["deletion_window_in_days"] = deletion_window_in_days
__props__.__dict__["description"] = description
if is_enabled is not None and not opts.urn:
warnings.warn("""Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""", DeprecationWarning)
pulumi.log.warn("""is_enabled is deprecated: Field 'is_enabled' has been deprecated from provider version 1.85.0. New field 'key_state' instead.""")
__props__.__dict__["is_enabled"] = is_enabled
__props__.__dict__["key_spec"] = key_spec
if key_state is not None and not opts.urn:
warnings.warn("""Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""", DeprecationWarning)
pulumi.log.warn("""key_state is deprecated: Field 'key_state' has been deprecated from provider version 1.123.1. New field 'status' instead.""")
__props__.__dict__["key_state"] = key_state
__props__.__dict__["key_usage"] = key_usage
__props__.__dict__["origin"] = origin
__props__.__dict__["pending_window_in_days"] = pending_window_in_days
__props__.__dict__["protection_level"] = protection_level
__props__.__dict__["rotation_interval"] = rotation_interval
__props__.__dict__["status"] = status
__props__.__dict__["arn"] = None
__props__.__dict__["creation_date"] = None
__props__.__dict__["creator"] = None
__props__.__dict__["delete_date"] = None
__props__.__dict__["last_rotation_date"] = None
__props__.__dict__["material_expire_time"] = None
__props__.__dict__["next_rotation_date"] = None
__props__.__dict__["primary_key_version"] = None
super(Key, __self__).__init__(
'alicloud:kms/key:Key',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
arn: Optional[pulumi.Input[str]] = None,
automatic_rotation: Optional[pulumi.Input[str]] = None,
creation_date: Optional[pulumi.Input[str]] = None,
creator: Optional[pulumi.Input[str]] = None,
delete_date: Optional[pulumi.Input[str]] = None,
deletion_window_in_days: Optional[pulumi.Input[int]] = None,
description: Optional[pulumi.Input[str]] = None,
is_enabled: Optional[pulumi.Input[bool]] = None,
key_spec: Optional[pulumi.Input[str]] = None,
key_state: Optional[pulumi.Input[str]] = None,
key_usage: Optional[pulumi.Input[str]] = None,
last_rotation_date: Optional[pulumi.Input[str]] = None,
material_expire_time: Optional[pulumi.Input[str]] = None,
next_rotation_date: Optional[pulumi.Input[str]] = None,
origin: Optional[pulumi.Input[str]] = None,
pending_window_in_days: Optional[pulumi.Input[int]] = None,
primary_key_version: Optional[pulumi.Input[str]] = None,
protection_level: Optional[pulumi.Input[str]] = None,
rotation_interval: Optional[pulumi.Input[str]] = None,
status: Optional[pulumi.Input[str]] = None) -> 'Key':
"""
Get an existing Key resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] arn: The Alicloud Resource Name (ARN) of the key.
* `creation_date` - The date and time when the CMK was created. The time is displayed in UTC.
* `creator` - The creator of the CMK.
* `delete_date` - The scheduled date to delete the CMK. The time is displayed in UTC. This value is returned only when the KeyState value is PendingDeletion.
:param pulumi.Input[str] automatic_rotation: Specifies whether to enable automatic key rotation. Default: "Disabled".
:param pulumi.Input[int] deletion_window_in_days: Field `deletion_window_in_days` has been deprecated since provider version 1.85.0. Use the new field `pending_window_in_days` instead.
:param pulumi.Input[str] description: The description of the key as viewed in Alicloud console.
:param pulumi.Input[bool] is_enabled: Field `is_enabled` has been deprecated since provider version 1.85.0. Use the new field `key_state` instead.
:param pulumi.Input[str] key_spec: The type of the CMK.
:param pulumi.Input[str] key_state: Field `key_state` has been deprecated since provider version 1.123.1. Use the new field `status` instead.
:param pulumi.Input[str] key_usage: Specifies the usage of the CMK. Currently defaults to `ENCRYPT/DECRYPT`, indicating that the CMK is used for encryption and decryption.
:param pulumi.Input[str] last_rotation_date: The date and time the last rotation was performed. The time is displayed in UTC.
:param pulumi.Input[str] material_expire_time: The time and date the key material for the CMK expires. The time is displayed in UTC. If the value is empty, the key material for the CMK does not expire.
:param pulumi.Input[str] next_rotation_date: The time the next rotation is scheduled for execution.
:param pulumi.Input[str] origin: The source of the key material for the CMK. Defaults to "Aliyun_KMS".
:param pulumi.Input[int] pending_window_in_days: Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
:param pulumi.Input[str] primary_key_version: The ID of the current primary key version of the symmetric CMK.
:param pulumi.Input[str] protection_level: The protection level of the CMK. Defaults to "SOFTWARE".
:param pulumi.Input[str] rotation_interval: The period of automatic key rotation. Unit: seconds.
:param pulumi.Input[str] status: The status of the CMK. Defaults to `Enabled`. Valid values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _KeyState.__new__(_KeyState)
__props__.__dict__["arn"] = arn
__props__.__dict__["automatic_rotation"] = automatic_rotation
__props__.__dict__["creation_date"] = creation_date
__props__.__dict__["creator"] = creator
__props__.__dict__["delete_date"] = delete_date
__props__.__dict__["deletion_window_in_days"] = deletion_window_in_days
__props__.__dict__["description"] = description
__props__.__dict__["is_enabled"] = is_enabled
__props__.__dict__["key_spec"] = key_spec
__props__.__dict__["key_state"] = key_state
__props__.__dict__["key_usage"] = key_usage
__props__.__dict__["last_rotation_date"] = last_rotation_date
__props__.__dict__["material_expire_time"] = material_expire_time
__props__.__dict__["next_rotation_date"] = next_rotation_date
__props__.__dict__["origin"] = origin
__props__.__dict__["pending_window_in_days"] = pending_window_in_days
__props__.__dict__["primary_key_version"] = primary_key_version
__props__.__dict__["protection_level"] = protection_level
__props__.__dict__["rotation_interval"] = rotation_interval
__props__.__dict__["status"] = status
return Key(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter
def arn(self) -> pulumi.Output[str]:
"""
The Alicloud Resource Name (ARN) of the key.
* `creation_date` - The date and time when the CMK was created. The time is displayed in UTC.
* `creator` - The creator of the CMK.
* `delete_date` - The scheduled date to delete the CMK. The time is displayed in UTC. This value is returned only when the KeyState value is PendingDeletion.
"""
return pulumi.get(self, "arn")
@property
@pulumi.getter(name="automaticRotation")
def automatic_rotation(self) -> pulumi.Output[Optional[str]]:
"""
Specifies whether to enable automatic key rotation. Default: "Disabled".
"""
return pulumi.get(self, "automatic_rotation")
@property
@pulumi.getter(name="creationDate")
def creation_date(self) -> pulumi.Output[str]:
return pulumi.get(self, "creation_date")
@property
@pulumi.getter
def creator(self) -> pulumi.Output[str]:
return pulumi.get(self, "creator")
@property
@pulumi.getter(name="deleteDate")
def delete_date(self) -> pulumi.Output[str]:
return pulumi.get(self, "delete_date")
@property
@pulumi.getter(name="deletionWindowInDays")
def deletion_window_in_days(self) -> pulumi.Output[int]:
"""
Field `deletion_window_in_days` has been deprecated since provider version 1.85.0. Use the new field `pending_window_in_days` instead.
"""
return pulumi.get(self, "deletion_window_in_days")
@property
@pulumi.getter
def description(self) -> pulumi.Output[Optional[str]]:
"""
The description of the key as viewed in Alicloud console.
"""
return pulumi.get(self, "description")
@property
@pulumi.getter(name="isEnabled")
def is_enabled(self) -> pulumi.Output[Optional[bool]]:
"""
Field `is_enabled` has been deprecated since provider version 1.85.0. Use the new field `key_state` instead.
"""
return pulumi.get(self, "is_enabled")
@property
@pulumi.getter(name="keySpec")
def key_spec(self) -> pulumi.Output[str]:
"""
The type of the CMK.
"""
return pulumi.get(self, "key_spec")
@property
@pulumi.getter(name="keyState")
def key_state(self) -> pulumi.Output[str]:
"""
Field `key_state` has been deprecated since provider version 1.123.1. Use the new field `status` instead.
"""
return pulumi.get(self, "key_state")
@property
@pulumi.getter(name="keyUsage")
def key_usage(self) -> pulumi.Output[Optional[str]]:
"""
Specifies the usage of the CMK. Currently defaults to `ENCRYPT/DECRYPT`, indicating that the CMK is used for encryption and decryption.
"""
return pulumi.get(self, "key_usage")
@property
@pulumi.getter(name="lastRotationDate")
def last_rotation_date(self) -> pulumi.Output[str]:
"""
The date and time the last rotation was performed. The time is displayed in UTC.
"""
return pulumi.get(self, "last_rotation_date")
@property
@pulumi.getter(name="materialExpireTime")
def material_expire_time(self) -> pulumi.Output[str]:
"""
The time and date the key material for the CMK expires. The time is displayed in UTC. If the value is empty, the key material for the CMK does not expire.
"""
return pulumi.get(self, "material_expire_time")
@property
@pulumi.getter(name="nextRotationDate")
def next_rotation_date(self) -> pulumi.Output[str]:
"""
The time the next rotation is scheduled for execution.
"""
return pulumi.get(self, "next_rotation_date")
@property
@pulumi.getter
def origin(self) -> pulumi.Output[Optional[str]]:
"""
The source of the key material for the CMK. Defaults to "Aliyun_KMS".
"""
return pulumi.get(self, "origin")
@property
@pulumi.getter(name="pendingWindowInDays")
def pending_window_in_days(self) -> pulumi.Output[int]:
"""
Duration in days after which the key is deleted after destruction of the resource, must be between 7 and 30 days. Defaults to 30 days.
"""
return pulumi.get(self, "pending_window_in_days")
@property
@pulumi.getter(name="primaryKeyVersion")
def primary_key_version(self) -> pulumi.Output[str]:
"""
The ID of the current primary key version of the symmetric CMK.
"""
return pulumi.get(self, "primary_key_version")
@property
@pulumi.getter(name="protectionLevel")
def protection_level(self) -> pulumi.Output[Optional[str]]:
"""
The protection level of the CMK. Defaults to "SOFTWARE".
"""
return pulumi.get(self, "protection_level")
@property
@pulumi.getter(name="rotationInterval")
def rotation_interval(self) -> pulumi.Output[Optional[str]]:
"""
The period of automatic key rotation. Unit: seconds.
"""
return pulumi.get(self, "rotation_interval")
@property
@pulumi.getter
def status(self) -> pulumi.Output[str]:
"""
The status of the CMK. Defaults to `Enabled`. Valid values: `Disabled`, `Enabled`, `PendingDeletion`.
"""
return pulumi.get(self, "status")
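The generated `_internal_init` above pairs every deprecated argument with a `warnings.warn`/`pulumi.log.warn` guard that is skipped when an existing resource is being rehydrated (i.e. when `opts.urn` is set). A minimal standalone sketch of that guard pattern, in plain Python without the Pulumi runtime (`init_key` and its arguments are illustrative, not part of the SDK):

```python
import warnings

def init_key(deletion_window_in_days=None, urn=None):
    """Sketch of the deprecation guard used by the generated _internal_init."""
    props = {}
    # Warn only for user-supplied values on fresh resources, not when an
    # existing resource is being rehydrated from its URN.
    if deletion_window_in_days is not None and not urn:
        warnings.warn(
            "Field 'deletion_window_in_days' has been deprecated from provider "
            "version 1.85.0. New field 'pending_window_in_days' instead.",
            DeprecationWarning,
        )
    props["deletion_window_in_days"] = deletion_window_in_days
    return props

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    assert init_key(deletion_window_in_days=7) == {"deletion_window_in_days": 7}
assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```

The URN check is what keeps `pulumi up` quiet when the deprecated value only exists in state rather than in the user's program.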
# --- movo_object_search/scripts/ros_util.py (zkytony/mos3d, MIT) ---
import rospy
def get_param(param):
if rospy.has_param(param):
return rospy.get_param(param)
else:
return rospy.get_param("~" + param)
def get_if_has_param(param):
if rospy.has_param(param):
return rospy.get_param(param)
elif rospy.has_param("~" + param):
return rospy.get_param("~" + param)
else:
return None
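Both helpers above try the global parameter name first and fall back to the node-private name (`~`-prefixed, which `rospy` resolves against the node's namespace); `get_param` raises if neither exists, while `get_if_has_param` returns `None`. The lookup order can be exercised without a ROS master by standing in a plain dict for the parameter server (illustrative sketch only):

```python
def lookup_param(params, name):
    """Global-then-private lookup, mirroring get_if_has_param above."""
    for candidate in (name, "~" + name):
        if candidate in params:
            return params[candidate]
    return None

server = {"rate": 10, "~debug": True}
assert lookup_param(server, "rate") == 10       # found under the global name
assert lookup_param(server, "debug") is True    # falls back to the private name
assert lookup_param(server, "missing") is None  # absent in both namespaces
```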
# --- html/semantics/embedded-content/the-img-element/404-response-with-actual-image-data.py (ziransun/wpt, BSD-3-Clause) ---
import base64
def main(req, res):
# base64.decodestring was deprecated and removed in Python 3.9; b64decode is the supported equivalent.
return 404, [('Content-Type', 'image/png')], base64.b64decode("iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAAhSURBVDhPY3wro/KfgQLABKXJBqMGjBoAAqMGDLwBDAwAEsoCTFWunmQAAAAASUVORK5CYII=")
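The point of this wpt handler is that the 404 response carries genuine image bytes, so tests can check that `<img>` treats the response as an error rather than rendering the payload. Decoding the embedded payload confirms it is a real PNG (the signature and IHDR fields are checked below):

```python
import base64

# The exact payload served above; decoding verifies it is a well-formed PNG.
PNG_B64 = "iVBORw0KGgoAAAANSUhEUgAAABAAAAAQCAYAAAAf8/9hAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAAhSURBVDhPY3wro/KfgQLABKXJBqMGjBoAAqMGDLwBDAwAEsoCTFWunmQAAAAASUVORK5CYII="
data = base64.b64decode(PNG_B64)
assert data[:8] == b"\x89PNG\r\n\x1a\n"  # PNG file signature
```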
# --- api/files/api/app/es/googlemetric.py (trackit/trackit-legacy, Apache-2.0) ---
import elasticsearch_dsl as dsl
from datetime import datetime, timedelta
from . import client
class GoogleMetric(dsl.DocType):
class Meta:
index = 'googlemetric'
identity = dsl.String(index='not_analyzed')
resource = dsl.String(index='not_analyzed')
metric = dsl.String(index='not_analyzed')
time = dsl.Date(format='date_optional_time||epoch_millis')
value = dsl.Double()
@classmethod
def daily_cpu_utilization(cls, identity_email):
s = cls.search()
s = s.filter('term', identity=identity_email)
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/cpu/utilization')
agg = s.aggs.bucket('intervals', 'date_histogram', field='time', interval='day', min_doc_count=1)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for interval in res['aggregations']['intervals']['buckets']:
yield interval['key_as_string'].split('T')[0], interval['utilization']['value']
@classmethod
def get_cpu_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/cpu/utilization')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_disk_read_iops_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/read_ops_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_disk_write_iops_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/write_ops_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_disk_read_bytes_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/read_bytes_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_disk_write_bytes_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/write_bytes_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_network_in_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/network/received_bytes_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
@classmethod
def get_network_out_usage(cls, identity_email, timespan=timedelta(days=30)):
s = cls.search()
s = s.filter('range', time={'gt': (datetime.utcnow() - timespan).isoformat()})
s = s.filter('term', metric='GCLOUD/COMPUTE:compute.googleapis.com/instance/network/sent_bytes_count')
s = s.filter('term', identity=identity_email)
agg = s.aggs.bucket('resources', 'terms', field='resource', size=300)
agg.metric('utilization', 'avg', field='value')
res = client.search(index='googlemetric', body=s.to_dict(), size=0)
for resource in res['aggregations']['resources']['buckets']:
yield resource['key'], resource['utilization']['value']
| 48.968 | 114 | 0.64679 | 749 | 6,121 | 5.186916 | 0.12283 | 0.01184 | 0.047362 | 0.049421 | 0.889833 | 0.862033 | 0.859459 | 0.850965 | 0.850965 | 0.850965 | 0 | 0.008966 | 0.180036 | 6,121 | 124 | 115 | 49.362903 | 0.765093 | 0 | 0 | 0.714286 | 0 | 0 | 0.253553 | 0.093449 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081633 | false | 0 | 0.030612 | 0 | 0.183673 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
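The six `get_*_usage` classmethods above are identical except for the Stackdriver metric path they filter on. A hypothetical table-driven variant (the names `METRIC_PATHS` and `metric_path` are illustrative, not part of the repo) would let one shared query helper cover them all:

```python
# Metric paths copied verbatim from the classmethods above;
# the short keys are illustrative.
METRIC_PATHS = {
    "cpu": "GCLOUD/COMPUTE:compute.googleapis.com/instance/cpu/utilization",
    "disk_read_iops": "GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/read_ops_count",
    "disk_write_iops": "GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/write_ops_count",
    "disk_read_bytes": "GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/read_bytes_count",
    "disk_write_bytes": "GCLOUD/COMPUTE:compute.googleapis.com/instance/disk/write_bytes_count",
    "network_in": "GCLOUD/COMPUTE:compute.googleapis.com/instance/network/received_bytes_count",
    "network_out": "GCLOUD/COMPUTE:compute.googleapis.com/instance/network/sent_bytes_count",
}

def metric_path(kind: str) -> str:
    """Look up the full metric identifier for a short usage kind."""
    return METRIC_PATHS[kind]
```

A single helper could then run the range/term filters and `terms` aggregation once, parameterized by `metric_path(kind)`, instead of repeating the query body per method.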
e4dd59243e64ff41554a0934896df221f4757ef4 | 353 | py | Python | tensorhelper/__init__.py | Ifeanyi-omeck/tensorhelper | 52c10ca0124b2048c07eea2fc3cc52cef15a5646 | [
"MIT"
] | null | null | null | tensorhelper/__init__.py | Ifeanyi-omeck/tensorhelper | 52c10ca0124b2048c07eea2fc3cc52cef15a5646 | [
"MIT"
] | null | null | null | tensorhelper/__init__.py | Ifeanyi-omeck/tensorhelper | 52c10ca0124b2048c07eea2fc3cc52cef15a5646 | [
"MIT"
] | null | null | null | from tensorhelper.custom_tensor import image_reader
from tensorhelper.custom_tensor import plot_generator
from tensorhelper.custom_tensor import load_custom_images
from tensorhelper.custom_tensor import make_confusion_matrix
from tensorhelper.custom_tensor import create_tensorboard_callback
from tensorhelper.custom_tensor import create_tensorhub_model | 58.833333 | 66 | 0.917847 | 46 | 353 | 6.695652 | 0.413043 | 0.311688 | 0.428571 | 0.545455 | 0.701299 | 0.25974 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065156 | 353 | 6 | 67 | 58.833333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
e4f04240f118511ccafd8d97f69fd7f65611496f | 1,182 | py | Python | tests/test_100_init.py | icarito/guy | 9477b548b91ae81bfc327dac7ba1ec80804f4f8d | [
"Apache-2.0"
] | null | null | null | tests/test_100_init.py | icarito/guy | 9477b548b91ae81bfc327dac7ba1ec80804f4f8d | [
"Apache-2.0"
] | null | null | null | tests/test_100_init.py | icarito/guy | 9477b548b91ae81bfc327dac7ba1ec80804f4f8d | [
"Apache-2.0"
] | null | null | null | from guy import Guy
def test_init(runner):
class T(Guy):
__doc__="""
<script>
guy.init( async function() {
await self.append("C")
guy.exit()
})
</script>
"""
def __init__(self):
Guy.__init__(self)
self.word="A"
def init(self):
self.append("B")
def append(self,letter):
self.word+=letter
def __del__(self):
self.word+="D" # will be ignored (but perhaps in future ?!)
t=T()
r=runner(t)
assert r.word=="ABC"
def test_init_async(runner):
class T(Guy):
__doc__="""
<script>
guy.init( async function() {
await self.append("C")
guy.exit()
})
</script>
"""
def __init__(self):
Guy.__init__(self)
self.word="A"
async def init(self):
self.append("B")
def append(self,letter):
self.word+=letter
def __del__(self):
self.word+="D" # will be ignored (but perhaps in future ?!)
t=T()
r=runner(t)
assert r.word=="ABC"
| 22.730769 | 72 | 0.464467 | 134 | 1,182 | 3.835821 | 0.253731 | 0.093385 | 0.085603 | 0.058366 | 0.906615 | 0.906615 | 0.906615 | 0.906615 | 0.906615 | 0.906615 | 0 | 0 | 0.396785 | 1,182 | 51 | 73 | 23.176471 | 0.720898 | 0.071912 | 0 | 0.888889 | 0 | 0 | 0.285192 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 1 | 0.2 | false | 0 | 0.022222 | 0 | 0.311111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
901ff52ae37615424026fdcefd2d55bac2278dbe | 9,113 | py | Python | src/api/tests/test_carbon_equivalencies.py | massenergize/api | 0df3368cb763e9160229f48138b7706a9d0569aa | [
"MIT"
] | 2 | 2020-07-24T12:58:17.000Z | 2020-12-17T02:26:13.000Z | src/api/tests/test_carbon_equivalencies.py | massenergize/api | 0df3368cb763e9160229f48138b7706a9d0569aa | [
"MIT"
] | 214 | 2019-06-26T17:33:54.000Z | 2022-03-26T00:02:34.000Z | src/api/tests/test_carbon_equivalencies.py | massenergize/portalBackEnd | 7ed971b2be13901667a216d8c8a46f0bed6d6ccd | [
"MIT"
] | 6 | 2020-03-13T20:29:06.000Z | 2021-08-20T16:15:08.000Z | from django.test import TestCase, Client
#from urllib.parse import urlencode
from django.utils.http import urlencode
from database.models import CarbonEquivalency
from api.tests.common import signinAs, setupCC, createUsers
class CarbonEquivalenciesTestCase(TestCase):
@classmethod
def setUpClass(self):
print("\n---> Testing Carbon Equivalencies <---\n")
self.client = Client()
self.USER, self.CADMIN, self.SADMIN = createUsers()
signinAs(self.client, self.SADMIN)
setupCC(self.client)
equivalency = {"name": "trees", "value": "41"}
self.CARBON_EQUIVALENCY = CarbonEquivalency.objects.create(**equivalency)
self.CARBON_EQUIVALENCY.save()
@classmethod
def tearDownClass(self):
pass
def setUp(self):
# this gets run on every test case
pass
def test_create(self):
# test not logged
signinAs(self.client, None)
response = self.client.post('/api/data.carbonEquivalency.create', urlencode({
"name": "test_none",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as user
signinAs(self.client, self.USER)
response = self.client.post('/api/data.carbonEquivalency.create', urlencode({
"name": "test_user",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as cadmin
signinAs(self.client, self.CADMIN)
response = self.client.post('/api/data.carbonEquivalency.create', urlencode({
"name": "test_cadmin",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as sadmin
signinAs(self.client, self.SADMIN)
response = self.client.post('/api/data.carbonEquivalency.create', urlencode({
"name": "test_sadmin",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
# test bad args
signinAs(self.client, self.SADMIN)
response = self.client.post('/api/data.carbonEquivalency.create', urlencode({"community_id": 3,
"name": "test_bad_args"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
def test_get(self):
# test not logged
signinAs(self.client, None)
response = self.client.get('/api/data.carbonEquivalency.get', urlencode({}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
self.assertGreater(len(response["data"]), 0)
# test logged as user
signinAs(self.client, self.USER)
response = self.client.get('/api/data.carbonEquivalency.get', urlencode({}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
self.assertGreater(len(response["data"]), 0)
# test logged as cadmin
signinAs(self.client, self.CADMIN)
response = self.client.get('/api/data.carbonEquivalency.get', urlencode({}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
self.assertGreater(len(response["data"]), 0)
# test logged as sadmin
signinAs(self.client, self.SADMIN)
response = self.client.get('/api/data.carbonEquivalency.get', urlencode({}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
self.assertGreater(len(response["data"]), 0)
# test get one
signinAs(self.client, self.USER)
response = self.client.post('/api/data.carbonEquivalency.get', urlencode({"id": self.CARBON_EQUIVALENCY.id}), content_type="application/x-www-form-urlencoded")
response = response.toDict()
self.assertTrue(response["success"])
def test_update(self):
# test not logged
signinAs(self.client, None)
response = self.client.post('/api/data.carbonEquivalency.update', urlencode({
"id": self.CARBON_EQUIVALENCY.id,
"name": "Another name",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as user
signinAs(self.client, self.USER)
response = self.client.post('/api/data.carbonEquivalency.update', urlencode({
"id": self.CARBON_EQUIVALENCY.id,
"name": "Another name",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as cadmin
signinAs(self.client, self.CADMIN)
response = self.client.post('/api/data.carbonEquivalency.update', urlencode({
"id": self.CARBON_EQUIVALENCY.id,
"name": "Another name",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
# test logged as sadmin
signinAs(self.client, self.SADMIN)
response = self.client.post('/api/data.carbonEquivalency.update', urlencode({
"id": self.CARBON_EQUIVALENCY.id,
"name": "Another name",
"value": 300,
"explanation": "explanation_text",
"reference": "google.com"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertTrue(response["success"])
self.assertEqual(response["data"]["name"], "Another name")
# test bad args
signinAs(self.client, self.SADMIN)
response = self.client.post('/api/data.carbonEquivalency.update', urlencode({"community_id": 333,
"name": "test_bad_args"}), content_type="application/x-www-form-urlencoded").toDict()
self.assertFalse(response["success"])
| 58.416667 | 167 | 0.463184 | 706 | 9,113 | 5.917847 | 0.126062 | 0.078985 | 0.068933 | 0.082575 | 0.833652 | 0.817855 | 0.809718 | 0.800144 | 0.800144 | 0.800144 | 0 | 0.00655 | 0.430374 | 9,113 | 155 | 168 | 58.793548 | 0.798305 | 0.038187 | 0 | 0.758929 | 0 | 0 | 0.204047 | 0.113169 | 0 | 0 | 0 | 0 | 0.178571 | 1 | 0.053571 | false | 0.017857 | 0.035714 | 0 | 0.098214 | 0.008929 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
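The tests above build form-encoded request bodies with `django.utils.http.urlencode`, which behaves like the standard library's `urllib.parse.urlencode` for flat dicts. A minimal sketch of what the test client actually posts:

```python
from urllib.parse import urlencode

# Encode a dict into an application/x-www-form-urlencoded body,
# mirroring the equivalency fixture used in setUpClass above.
body = urlencode({"name": "trees", "value": 41})  # → "name=trees&value=41"
```

The resulting string is what the `content_type="application/x-www-form-urlencoded"` posts send as the request body.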
5f7261905e6f51f0665b28b4fd5f40a40f426868 | 136 | py | Python | product/admin.py | hossainchisty/Multi-Vendor-eCommerce | 42c5f62b8b098255cc9ea57858d3cc7de94bd76a | [
"MIT"
] | 16 | 2021-09-22T19:08:28.000Z | 2022-03-18T18:57:02.000Z | product/admin.py | hossainchisty/Multi-Vendor-eCommerce | 42c5f62b8b098255cc9ea57858d3cc7de94bd76a | [
"MIT"
] | 6 | 2021-09-30T12:36:02.000Z | 2022-03-18T22:18:00.000Z | product/admin.py | hossainchisty/Multi-Vendor-eCommerce | 42c5f62b8b098255cc9ea57858d3cc7de94bd76a | [
"MIT"
] | 6 | 2021-12-06T02:04:51.000Z | 2022-03-13T14:38:14.000Z | from django.contrib import admin
from .models import Product, Category
admin.site.register(Category)
admin.site.register(Product)
| 22.666667 | 38 | 0.794118 | 18 | 136 | 6 | 0.555556 | 0.240741 | 0.314815 | 0.462963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 136 | 5 | 39 | 27.2 | 0.907563 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
5f803cc45a6de7051b060de81a3095fd4dce7254 | 210 | py | Python | utils.py | chengfenggui/LogisticsManagementSys | 00715195ee011789725b06b684f4b3de6c8ac4b9 | [
"Apache-2.0"
] | 4 | 2020-12-17T10:52:49.000Z | 2022-03-25T08:26:47.000Z | utils.py | chengfenggui/LogisticsManagementSys | 00715195ee011789725b06b684f4b3de6c8ac4b9 | [
"Apache-2.0"
] | null | null | null | utils.py | chengfenggui/LogisticsManagementSys | 00715195ee011789725b06b684f4b3de6c8ac4b9 | [
"Apache-2.0"
] | 1 | 2020-12-18T01:41:16.000Z | 2020-12-18T01:41:16.000Z | import base64
import psycopg2
# Decryption method
def base64decode(str):
return base64.urlsafe_b64decode(str).decode()
# Encryption method
def base64encode(str):
return base64.urlsafe_b64encode(str.encode("utf-8")).decode()
| 16.153846 | 65 | 0.742857 | 27 | 210 | 5.703704 | 0.62963 | 0.116883 | 0.194805 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.087912 | 0.133333 | 210 | 12 | 66 | 17.5 | 0.758242 | 0.042857 | 0 | 0 | 0 | 0 | 0.025253 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
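The two helpers above are URL-safe base64 wrappers (despite the encrypt/decrypt naming, base64 is an encoding, not encryption). A minimal round-trip sketch under the same API:

```python
import base64

def base64encode(s: str) -> str:
    # URL-safe base64 of a UTF-8 string, as in utils.py above.
    return base64.urlsafe_b64encode(s.encode("utf-8")).decode()

def base64decode(s: str) -> str:
    # Inverse: decode URL-safe base64 back to a string.
    return base64.urlsafe_b64decode(s).decode()

token = base64encode("hello")  # → "aGVsbG8="
original = base64decode(token)
```

The URL-safe alphabet replaces `+` and `/` with `-` and `_`, so the output is safe to embed in query strings and paths.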
5fb13648d4ee1eef58aae38ee545691ce182eece | 82 | py | Python | ips/ip/i2c_slave/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/i2c_slave/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | ips/ip/i2c_slave/__init__.py | zld012739/zldrepository | 5635b78a168956091676ef4dd99fa564be0e5ba0 | [
"MIT"
] | null | null | null | from i2c_slave_partial import get_ip_name
from i2c_slave_partial import I2C_SLAVE
| 27.333333 | 41 | 0.902439 | 15 | 82 | 4.466667 | 0.533333 | 0.358209 | 0.358209 | 0.567164 | 0.746269 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040541 | 0.097561 | 82 | 2 | 42 | 41 | 0.864865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
84096935381c0987ea8bc377cb160c3f8217b9a0 | 211 | py | Python | evap/context_processors.py | JenniferStamm/EvaP | 1d71e4efcd34d01f28e30c6026c8dcc708921193 | [
"MIT"
] | null | null | null | evap/context_processors.py | JenniferStamm/EvaP | 1d71e4efcd34d01f28e30c6026c8dcc708921193 | [
"MIT"
] | null | null | null | evap/context_processors.py | JenniferStamm/EvaP | 1d71e4efcd34d01f28e30c6026c8dcc708921193 | [
"MIT"
] | null | null | null | from django.conf import settings
def legal_notice_active(request):
return {'LEGAL_NOTICE_ACTIVE': settings.LEGAL_NOTICE_ACTIVE}
def tracker_url(request):
return {'TRACKER_URL': settings.TRACKER_URL}
| 21.1 | 64 | 0.78673 | 28 | 211 | 5.607143 | 0.464286 | 0.210191 | 0.324841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123223 | 211 | 9 | 65 | 23.444444 | 0.848649 | 0 | 0 | 0 | 0 | 0 | 0.14218 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0 | 0.2 | 0.4 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
842c50de06f4928b295eeeda7c03a4a827664914 | 3,665 | py | Python | tests/integration/text/test_remove_fasta_entry.py | JLSteenwyk/BioKIT | 9ca31d8003dc845bf56b2c56c87820c0b05021c4 | [
"MIT"
] | 8 | 2021-10-03T21:08:33.000Z | 2021-12-02T17:15:32.000Z | tests/integration/text/test_remove_fasta_entry.py | JLSteenwyk/BioKIT | 9ca31d8003dc845bf56b2c56c87820c0b05021c4 | [
"MIT"
] | null | null | null | tests/integration/text/test_remove_fasta_entry.py | JLSteenwyk/BioKIT | 9ca31d8003dc845bf56b2c56c87820c0b05021c4 | [
"MIT"
] | 5 | 2021-10-05T06:25:03.000Z | 2022-01-04T11:01:09.000Z | import pytest
from mock import patch, call # noqa
from pathlib import Path
import sys
from biokit.biokit import Biokit
here = Path(__file__)
@pytest.mark.integration
class TestRemoveFastaEntry(object):
@patch("builtins.print")
def test_remove_fasta_entry_invalid_input(self, mocked_print): # noqa
with pytest.raises(SystemExit) as pytest_wrapped_e:
Biokit()
assert pytest_wrapped_e.type == SystemExit
assert pytest_wrapped_e.value.code == 2
@patch("builtins.print")
def test_remove_fasta_entry(self, mocked_print):
testargs = [
"biokit",
"remove_fasta_entry",
f"{here.parent.parent.parent}/sample_files/simple.fa",
"-e",
"1",
]
with patch.object(sys, "argv", testargs):
Biokit()
with open(
f"{here.parent.parent}/expected/simple_pruned.fa", "r"
) as expected_out:
expected_out = expected_out.read()
with open(
f"{here.parent.parent.parent}/sample_files/simple.fa.pruned.fa", "r"
) as output_file:
output_file = output_file.read()
assert expected_out == output_file
@patch("builtins.print")
def test_remove_fasta_entry_long_arg(self, mocked_print):
testargs = [
"biokit",
"remove_fasta_entry",
f"{here.parent.parent.parent}/sample_files/simple.fa",
"--entry",
"1",
]
with patch.object(sys, "argv", testargs):
Biokit()
with open(
f"{here.parent.parent}/expected/simple_pruned.fa", "r"
) as expected_out:
expected_out = expected_out.read()
with open(
f"{here.parent.parent.parent}/sample_files/simple.fa.pruned.fa", "r"
) as output_file:
output_file = output_file.read()
assert expected_out == output_file
@patch("builtins.print")
def test_remove_fasta_entry_custom_out(self, mocked_print):
testargs = [
"biokit",
"remove_fasta_entry",
f"{here.parent.parent.parent}/sample_files/simple.fa",
"--entry",
"1",
"-o",
f"{here.parent.parent.parent}/sample_files/simple.fa.custom_out.pruned.fa"
]
with patch.object(sys, "argv", testargs):
Biokit()
with open(
f"{here.parent.parent}/expected/simple_pruned.fa", "r"
) as expected_out:
expected_out = expected_out.read()
with open(
f"{here.parent.parent.parent}/sample_files/simple.fa.custom_out.pruned.fa", "r"
) as output_file:
output_file = output_file.read()
assert expected_out == output_file
@patch("builtins.print")
def test_remove_fasta_entry_custom_out_long_arg(self, mocked_print):
testargs = [
"biokit",
"remove_fasta_entry",
f"{here.parent.parent.parent}/sample_files/simple.fa",
"--entry",
"1",
"--output",
f"{here.parent.parent.parent}/sample_files/simple.fa.custom_out.pruned.fa"
]
with patch.object(sys, "argv", testargs):
Biokit()
with open(
f"{here.parent.parent}/expected/simple_pruned.fa", "r"
) as expected_out:
expected_out = expected_out.read()
with open(
f"{here.parent.parent.parent}/sample_files/simple.fa.custom_out.pruned.fa", "r"
) as output_file:
output_file = output_file.read()
assert expected_out == output_file
| 30.289256 | 91 | 0.581173 | 424 | 3,665 | 4.792453 | 0.141509 | 0.141732 | 0.075787 | 0.117126 | 0.846457 | 0.846457 | 0.846457 | 0.846457 | 0.806102 | 0.806102 | 0 | 0.001949 | 0.299864 | 3,665 | 120 | 92 | 30.541667 | 0.789945 | 0.002456 | 0 | 0.765306 | 0 | 0 | 0.277854 | 0.215713 | 0 | 0 | 0 | 0 | 0.061224 | 1 | 0.05102 | false | 0 | 0.05102 | 0 | 0.112245 | 0.102041 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
84491f18738cc88fe9a853594efb082e22d5416e | 81 | py | Python | app/school/__init__.py | Anioko/CMS | b6465faf2a5d7333f494526bcddf8083d6807aee | [
"MIT"
] | null | null | null | app/school/__init__.py | Anioko/CMS | b6465faf2a5d7333f494526bcddf8083d6807aee | [
"MIT"
] | 1 | 2021-06-02T01:40:15.000Z | 2021-06-02T01:40:15.000Z | app/school/__init__.py | Anioko/CMS | b6465faf2a5d7333f494526bcddf8083d6807aee | [
"MIT"
] | null | null | null | from app.school import errors # noqa
from app.school.views import school # noqa
| 27 | 42 | 0.777778 | 13 | 81 | 4.846154 | 0.538462 | 0.222222 | 0.412698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160494 | 81 | 2 | 43 | 40.5 | 0.926471 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ffdc3b47a0aed966d41adef53d607b3175e656a9 | 44,672 | py | Python | events/input_events.py | mtasa-typescript/mtasa-wiki-dump | edea1746850fb6c99d6155d1d7891e2cceb33a5c | [
"MIT"
] | null | null | null | events/input_events.py | mtasa-typescript/mtasa-wiki-dump | edea1746850fb6c99d6155d1d7891e2cceb33a5c | [
"MIT"
] | 1 | 2021-02-24T21:50:18.000Z | 2021-02-24T21:50:18.000Z | events/input_events.py | mtasa-typescript/mtasa-wiki-dump | edea1746850fb6c99d6155d1d7891e2cceb33a5c | [
"MIT"
] | null | null | null | # Autogenerated file. ANY CHANGES WILL BE OVERWRITTEN
from to_python.core.types import FunctionType, \
FunctionArgument, \
FunctionArgumentValues, \
FunctionReturnTypes, \
FunctionSignature, \
FunctionDoc, \
EventData, \
CompoundEventData
DUMP_PARTIAL = [
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientCharacter',
docs=FunctionDoc(
description='This event triggers whenever the user presses an alphanumeric character on their keyboard. This also includes special characters, ie. / # % { }.' ,
arguments={
"character": """: a string representing the pressed character. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='character',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientClick',
docs=FunctionDoc(
                    description='This event triggers whenever the user clicks his mouse. This is linked to the GTA world, as opposed to GUI for which onClientGUIClick is to be used. This event allows detection of click positions of the 3D world.' ,
arguments={
"button": """: This refers the button used to click on the mouse, can be left, right, or middle. """,
"state": """: This can be used to tell if the user released or pressed the mouse button, where up is passed if the button is released, and down is passed if the button is pushed. """,
"absoluteX": """: This refers to the 2D x coordinate the user clicked on his screen, and is an absolute position in pixels. """,
"absoluteY": """: This refers to the 2D y coordinate the user clicked on his screen, and is an absolute position in pixels. """,
"worldX": """: This represents the 3D x coordinate the player clicked on the screen, and is relative to the GTA world. """,
"worldY": """: This represents the 3D y coordinate the player clicked on the screen, and is relative to the GTA world. """,
"worldZ": """: This represents the 3D z coordinate the player clicked on the screen, and is relative to the GTA world. """,
"clickedWorld": """: This represents any physical entity elements that were clicked. If the player clicked on no MTA element, its set to false. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='state',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='clickedWorld',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientCursorMove',
docs=FunctionDoc(
description='This event is called by the root element whenever the cursor is moved over the screen, by the player. It returns information about the world coordinates as well as the screen coordinates of where the player moved the cursor.\nThe difference between this event and onClientMouseMove, is that the latter is actually called by GUI elements. This is to prevent double calling of onClientCursorMove, as onClientCursorMove is always called.' ,
arguments={
"cursorX": """the relative X coordinate of the mouse cursor. 0 = left side of the screen, 1 = right side. """,
"cursorY": """the relative Y coordinate of the mouse cursor. 0 = top of the screen, 1 = bottom. """,
"absoluteX": """the X coordinate of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """the Y coordinate of the mouse cursor, in pixels, measured from the top of the screen. """,
"worldX, worldY, worldZ": """the 3D in-game world coordinates that the cursor is pointing at. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='cursorX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='cursorY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientDoubleClick',
docs=FunctionDoc(
                    description='This event triggers whenever the user double-clicks his mouse. This is linked to the GTA world, as opposed to GUI for which onClientGUIDoubleClick is to be used. This event allows detection of click positions of the 3D world.' ,
arguments={
"button": """: This refers the button used to click on the mouse, can be left, right, or middle. """,
"absoluteX": """: This refers to the 2D x coordinate the user clicked on his screen, and is an absolute position in pixels. """,
"absoluteY": """: This refers to the 2D y coordinate the user clicked on his screen, and is an absolute position in pixels. """,
"worldX": """: This represents the 3D x coordinate the player clicked on the screen, and is relative to the GTA world. """,
"worldY": """: This represents the 3D y coordinate the player clicked on the screen, and is relative to the GTA world. """,
"worldZ": """: This represents the 3D z coordinate the player clicked on the screen, and is relative to the GTA world. """,
"clickedWorld": """: This represents any physical entity elements that were clicked. If the player clicked on no MTA element, its set to false. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldX',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldY',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='worldZ',
argument_type=FunctionType(
names=['float'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='clickedWorld',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIAccepted',
docs=FunctionDoc(
description='This event is triggered when enter is pressed on an editbox.' ,
arguments={
"editBox": """: the Element/GUI/Edit_field|editbox which had focus. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='editBox',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIBlur',
docs=FunctionDoc(
description='This event is triggered each time a GUI element loses input focus (mainly useful for windows, editboxes and memos, but triggered for all GUI elements nevertheless).' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIChanged',
docs=FunctionDoc(
description='This event is fired when an Element/GUI/Memo|memo or an Element/GUI/Edit_field|editbox has changed (either by the user or by guiSetText).' ,
arguments={
"theElement": """: The GUI element which was changed. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIClick',
docs=FunctionDoc(
description='This event is triggered when any GUI element is clicked.' ,
arguments={
"button": """the name of the mouse button that was clicked; it can be left, right or middle. """,
"state": """the state of the mouse button: down if the button was pushed, or up if it was released. Note that currently only the up state is supported. """,
"absoluteX": """the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """the Y position of the mouse cursor, in pixels, measured from the top of the screen. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='state',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIComboBoxAccepted',
docs=FunctionDoc(
description='This event is called when an Element/GUI/Combobox|combobox is accepted.' ,
arguments={
"theElement": """the Element/GUI/Combobox|combobox that got accepted. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIDoubleClick',
docs=FunctionDoc(
description='This event is fired when the user double-clicks a GUI element. It does not work with buttons.' ,
arguments={
"button": """the name of the mouse button that the GUI element was double clicked with. """,
"state": """the state of the mouse button. Can be down or up. """,
"absoluteX": """the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """the Y position of the mouse cursor, in pixels, measured from the top of the screen. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='state',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIFocus',
docs=FunctionDoc(
description='This event is triggered each time a GUI element gains input focus (mainly useful for windows, editboxes and memos, but triggered for all GUI elements nevertheless).' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIMouseDown',
docs=FunctionDoc(
description='This event is fired when the user presses a mouse button while on a GUI element.' ,
arguments={
"button": """the name of the mouse button that the GUI element was clicked with, can be left, right, or middle. """,
"absoluteX": """the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """the Y position of the mouse cursor, in pixels, measured from the top of the screen. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIMouseUp',
docs=FunctionDoc(
description='This event is fired when the user releases a mouse button while on top of a GUI element.' ,
arguments={
"button": """the name of the mouse button that was released on a GUI element, can be left, right, or middle. """,
"absoluteX": """the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """the Y position of the mouse cursor, in pixels, measured from the top of the screen. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIMove',
docs=FunctionDoc(
description='This event is triggered each time the user moves a GUI element.' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUIScroll',
docs=FunctionDoc(
description='This event is fired when a GUI scrollbar is scrolled.' ,
arguments={
"scrolled": """: the Element/GUI/Scrollbar|scrollbar element that was scrolled. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='scrolled',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUISize',
docs=FunctionDoc(
description='This event is triggered when the local client resizes a GUI element.' ,
arguments={
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientGUITabSwitched',
docs=FunctionDoc(
description='This event is triggered each time the user switches GUI tabs.\nWhen adding the event handler on the tab panel, propagate must be true.' ,
arguments={
"theElement": """: the Element/GUI/Tab|tab which was selected. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='theElement',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientKey',
docs=FunctionDoc(
description='This event triggers whenever the user presses a button on their keyboard or mouse.\nThis event can also be used to see if the client scrolls their mouse wheel.' ,
arguments={
"button": """: This refers to the button pressed. See key names for a list of keys. """,
"pressOrRelease": """: This refers to whether they were pressing or releasing the key, true when pressing, false when releasing. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='button',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='pressOrRelease',
argument_type=FunctionType(
names=['bool'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientMouseEnter',
docs=FunctionDoc(
description='This event is fired when the user moves the mouse over a GUI element.' ,
arguments={
"absoluteX": """: the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """: the Y position of the mouse cursor, in pixels, measured from the top of the screen. """,
"leftGUI": """: the GUI element that was switched from, or nil if it does not exist. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='leftGUI',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientMouseLeave',
docs=FunctionDoc(
description='This event is fired when the user moves the mouse away from a GUI element.' ,
arguments={
"absoluteX": """: the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """: the Y position of the mouse cursor, in pixels, measured from the top of the screen. """,
"enteredGUI": """: the GUI element that was switched to, or nil if it does not exist. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='enteredGUI',
argument_type=FunctionType(
names=['element'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientMouseMove',
docs=FunctionDoc(
description='This event is triggered each time the user moves the mouse on top of a GUI element.' ,
arguments={
"absoluteX": """: the X position of the mouse cursor, in pixels, measured from the left side of the screen. """,
"absoluteY": """: the Y position of the mouse cursor, in pixels, measured from the top of the screen. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='absoluteX',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
],
[
FunctionArgument(
name='absoluteY',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientMouseWheel',
docs=FunctionDoc(
description='This event is triggered each time the user scrolls their mouse wheel on top of a GUI element.' ,
arguments={
"upOrDown": """: An int|integer representing whether the scroll was scrolled up or down. This can be either 1 (mouse was scrolled up) or -1 (mouse was scrolled down). """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='upOrDown',
argument_type=FunctionType(
names=['int'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
),
CompoundEventData(
server=[
],
client=[
EventData(
name='onClientPaste',
docs=FunctionDoc(
description='This event triggers when the user pastes something (CTRL + V). It is not triggered if the menu or console is visible, if any browser is focused, or if the cursor is invisible.' ,
arguments={
"clipboardText": """: a string representing the pasted value from clipboard. """
},
result='' ,
),
arguments=FunctionArgumentValues(
arguments=[
[
FunctionArgument(
name='clipboardText',
argument_type=FunctionType(
names=['string'],
is_optional=False,
),
default_value=None,
)
]
],
variable_length=False,
),
)
],
)
]
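The nested literals above build a list of `CompoundEventData` records, each holding server- and client-side `EventData` entries. As a minimal, self-contained sketch of how such a list might be traversed, the dataclasses below are simplified stand-ins for the real `EventData`/`CompoundEventData` types, not the project's actual classes:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class EventData:
    """Simplified stand-in for the real EventData (name only)."""
    name: str


@dataclass
class CompoundEventData:
    """Simplified stand-in holding server- and client-side events."""
    server: List[EventData] = field(default_factory=list)
    client: List[EventData] = field(default_factory=list)


def client_event_names(events: List[CompoundEventData]) -> List[str]:
    """Collect the names of all client-side events, in order."""
    return [e.name for compound in events for e in compound.client]


events = [
    CompoundEventData(client=[EventData(name='onClientGUIFocus')]),
    CompoundEventData(client=[EventData(name='onClientKey')]),
]
assert client_event_names(events) == ['onClientGUIFocus', 'onClientKey']
```

The same flattening pattern works for the server-side lists; only the attribute name changes.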
# tests/__init__.py (venturehacks/dbt-metabase, MIT)
from .test_dbt_folder_reader import *
from .test_dbt_manifest_reader import *
from .test_metabase import *
# des/NoneSwapper.py (Kushal-kothari/Cryptography-Network_Security, MIT)
class NoneSwapper:
    """Identity swapper: both operations return the bit string unchanged."""

    def encrypt(self, binary: str) -> str:
        return binary

    def decrypt(self, binary: str) -> str:
        return binary
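Because both methods are identities, `decrypt(encrypt(x)) == x` holds trivially. A standalone round-trip check (the class is re-defined here only so the example runs on its own):

```python
class NoneSwapper:
    """Identity swapper: both operations return the bit string unchanged."""

    def encrypt(self, binary: str) -> str:
        return binary

    def decrypt(self, binary: str) -> str:
        return binary


swapper = NoneSwapper()
bits = "0110100001101001"
# A no-op swapper must be its own inverse.
assert swapper.decrypt(swapper.encrypt(bits)) == bits
assert swapper.encrypt(bits) == bits
```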
# electrum_bynd/tests/test_psbt.py (beyondcoin-project/electrum-bynd, MIT)
from pprint import pprint
import unittest
from electrum_bynd import constants
from electrum_bynd.transaction import (tx_from_any, PartialTransaction, BadHeaderMagic, UnexpectedEndOfStream,
SerializationError, PSBTInputConsistencyFailure)
from . import ElectrumTestCase, TestCaseForTestnet
class TestValidPSBT(TestCaseForTestnet):
# test cases from BIP-0174
def test_valid_psbt_001(self):
# Case: PSBT with one P2PKH input. Outputs are empty
tx1 = tx_from_any(bytes.fromhex('70736274ff0100750200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf60000000000feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e1300000100fda5010100000000010289a3c71eab4d20e0371bbba4cc698fa295c9463afa2e397f8533ccb62f9567e50100000017160014be18d152a9b012039daf3da7de4f53349eecb985ffffffff86f8aa43a71dff1448893a530a7237ef6b4608bbb2dd2d0171e63aec6a4890b40100000017160014fe3e9ef1a745e974d902c4355943abcb34bd5353ffffffff0200c2eb0b000000001976a91485cff1097fd9e008bb34af709c62197b38978a4888ac72fef84e2c00000017a914339725ba21efd62ac753a9bcd067d6c7a6a39d05870247304402202712be22e0270f394f568311dc7ca9a68970b8025fdd3b240229f07f8a5f3a240220018b38d7dcd314e734c9276bd6fb40f673325bc4baa144c800d2f2f02db2765c012103d2e15674941bad4a996372cb87e1856d3652606d98562fe39c5e9e7e413f210502483045022100d12b852d85dcd961d2f5f4ab660654df6eedcc794c0c33ce5cc309ffb5fce58d022067338a8e0e1725c197fb1a88af59f51e44e4255b20167c8684031c05d1f2592a01210223b72beef0965d10be0778efecd61fcac6f79a4ea169393380734464f84f2ab300000000000000'))
tx2 = tx_from_any('cHNidP8BAHUCAAAAASaBcTce3/KF6Tet7qSze3gADAVmy7OtZGQXE8pCFxv2AAAAAAD+////AtPf9QUAAAAAGXapFNDFmQPFusKGh2DpD9UhpGZap2UgiKwA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHh7MuEwAAAQD9pQEBAAAAAAECiaPHHqtNIOA3G7ukzGmPopXJRjr6Ljl/hTPMti+VZ+UBAAAAFxYAFL4Y0VKpsBIDna89p95PUzSe7LmF/////4b4qkOnHf8USIk6UwpyN+9rRgi7st0tAXHmOuxqSJC0AQAAABcWABT+Pp7xp0XpdNkCxDVZQ6vLNL1TU/////8CAMLrCwAAAAAZdqkUhc/xCX/Z4Ai7NK9wnGIZeziXikiIrHL++E4sAAAAF6kUM5cluiHv1irHU6m80GfWx6ajnQWHAkcwRAIgJxK+IuAnDzlPVoMR3HyppolwuAJf3TskAinwf4pfOiQCIAGLONfc0xTnNMkna9b7QPZzMlvEuqFEyADS8vAtsnZcASED0uFWdJQbrUqZY3LLh+GFbTZSYG2YVi/jnF6efkE/IQUCSDBFAiEA0SuFLYXc2WHS9fSrZgZU327tzHlMDDPOXMMJ/7X85Y0CIGczio4OFyXBl/saiK9Z9R5E5CVbIBZ8hoQDHAXR8lkqASECI7cr7vCWXRC+B3jv7NYfysb3mk6haTkzgHNEZPhPKrMAAAAAAAAA')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.inputs()))
self.assertFalse(tx.inputs()[0].is_complete())
def test_valid_psbt_002(self):
# Case: PSBT with one P2PKH input and one P2SH-P2WPKH input. First input is signed and finalized. Outputs are empty
tx1 = tx_from_any(bytes.fromhex('70736274ff0100a00200000002ab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40000000000feffffffab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40100000000feffffff02603bea0b000000001976a914768a40bbd740cbe81d988e71de2a4d5c71396b1d88ac8e240000000000001976a9146f4620b553fa095e721b9ee0efe9fa039cca459788ac000000000001076a47304402204759661797c01b036b25928948686218347d89864b719e1f7fcf57d1e511658702205309eabf56aa4d8891ffd111fdf1336f3a29da866d7f8486d75546ceedaf93190121035cdc61fc7ba971c0b501a646a2a83b102cb43881217ca682dc86e2d73fa882920001012000e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787010416001485d13537f2e265405a34dbafa9e3dda01fb82308000000'))
tx2 = tx_from_any('cHNidP8BAKACAAAAAqsJSaCMWvfEm4IS9Bfi8Vqz9cM9zxU4IagTn4d6W3vkAAAAAAD+////qwlJoIxa98SbghL0F+LxWrP1wz3PFTghqBOfh3pbe+QBAAAAAP7///8CYDvqCwAAAAAZdqkUdopAu9dAy+gdmI5x3ipNXHE5ax2IrI4kAAAAAAAAGXapFG9GILVT+glechue4O/p+gOcykWXiKwAAAAAAAEHakcwRAIgR1lmF5fAGwNrJZKJSGhiGDR9iYZLcZ4ff89X0eURZYcCIFMJ6r9Wqk2Ikf/REf3xM286KdqGbX+EhtdVRs7tr5MZASEDXNxh/HupccC1AaZGoqg7ECy0OIEhfKaC3Ibi1z+ogpIAAQEgAOH1BQAAAAAXqRQ1RebjO4MsRwUPJNPuuTycA5SLx4cBBBYAFIXRNTfy4mVAWjTbr6nj3aAfuCMIAAAA')
for tx in (tx1, tx2):
self.assertEqual(2, len(tx.inputs()))
self.assertTrue(tx.inputs()[0].is_complete())
self.assertFalse(tx.inputs()[1].is_complete())
def test_valid_psbt_003(self):
# Case: PSBT with one P2PKH input which has a non-final scriptSig and has a sighash type specified. Outputs are empty
tx1 = tx_from_any(bytes.fromhex('70736274ff0100750200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf60000000000feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e1300000100fda5010100000000010289a3c71eab4d20e0371bbba4cc698fa295c9463afa2e397f8533ccb62f9567e50100000017160014be18d152a9b012039daf3da7de4f53349eecb985ffffffff86f8aa43a71dff1448893a530a7237ef6b4608bbb2dd2d0171e63aec6a4890b40100000017160014fe3e9ef1a745e974d902c4355943abcb34bd5353ffffffff0200c2eb0b000000001976a91485cff1097fd9e008bb34af709c62197b38978a4888ac72fef84e2c00000017a914339725ba21efd62ac753a9bcd067d6c7a6a39d05870247304402202712be22e0270f394f568311dc7ca9a68970b8025fdd3b240229f07f8a5f3a240220018b38d7dcd314e734c9276bd6fb40f673325bc4baa144c800d2f2f02db2765c012103d2e15674941bad4a996372cb87e1856d3652606d98562fe39c5e9e7e413f210502483045022100d12b852d85dcd961d2f5f4ab660654df6eedcc794c0c33ce5cc309ffb5fce58d022067338a8e0e1725c197fb1a88af59f51e44e4255b20167c8684031c05d1f2592a01210223b72beef0965d10be0778efecd61fcac6f79a4ea169393380734464f84f2ab30000000001030401000000000000'))
tx2 = tx_from_any('cHNidP8BAHUCAAAAASaBcTce3/KF6Tet7qSze3gADAVmy7OtZGQXE8pCFxv2AAAAAAD+////AtPf9QUAAAAAGXapFNDFmQPFusKGh2DpD9UhpGZap2UgiKwA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHh7MuEwAAAQD9pQEBAAAAAAECiaPHHqtNIOA3G7ukzGmPopXJRjr6Ljl/hTPMti+VZ+UBAAAAFxYAFL4Y0VKpsBIDna89p95PUzSe7LmF/////4b4qkOnHf8USIk6UwpyN+9rRgi7st0tAXHmOuxqSJC0AQAAABcWABT+Pp7xp0XpdNkCxDVZQ6vLNL1TU/////8CAMLrCwAAAAAZdqkUhc/xCX/Z4Ai7NK9wnGIZeziXikiIrHL++E4sAAAAF6kUM5cluiHv1irHU6m80GfWx6ajnQWHAkcwRAIgJxK+IuAnDzlPVoMR3HyppolwuAJf3TskAinwf4pfOiQCIAGLONfc0xTnNMkna9b7QPZzMlvEuqFEyADS8vAtsnZcASED0uFWdJQbrUqZY3LLh+GFbTZSYG2YVi/jnF6efkE/IQUCSDBFAiEA0SuFLYXc2WHS9fSrZgZU327tzHlMDDPOXMMJ/7X85Y0CIGczio4OFyXBl/saiK9Z9R5E5CVbIBZ8hoQDHAXR8lkqASECI7cr7vCWXRC+B3jv7NYfysb3mk6haTkzgHNEZPhPKrMAAAAAAQMEAQAAAAAAAA==')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.inputs()))
self.assertEqual(1, tx.inputs()[0].sighash)
self.assertFalse(tx.inputs()[0].is_complete())
def test_valid_psbt_004(self):
# Case: PSBT with one P2PKH input and one P2SH-P2WPKH input both with non-final scriptSigs. P2SH-P2WPKH input's redeemScript is available. Outputs filled.
tx1 = tx_from_any(bytes.fromhex('70736274ff0100a00200000002ab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40000000000feffffffab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40100000000feffffff02603bea0b000000001976a914768a40bbd740cbe81d988e71de2a4d5c71396b1d88ac8e240000000000001976a9146f4620b553fa095e721b9ee0efe9fa039cca459788ac00000000000100df0200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf6000000006a473044022070b2245123e6bf474d60c5b50c043d4c691a5d2435f09a34a7662a9dc251790a022001329ca9dacf280bdf30740ec0390422422c81cb45839457aeb76fc12edd95b3012102657d118d3357b8e0f4c2cd46db7b39f6d9c38d9a70abcb9b2de5dc8dbfe4ce31feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e13000001012000e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787010416001485d13537f2e265405a34dbafa9e3dda01fb8230800220202ead596687ca806043edc3de116cdf29d5e9257c196cd055cf698c8d02bf24e9910b4a6ba670000008000000080020000800022020394f62be9df19952c5587768aeb7698061ad2c4a25c894f47d8c162b4d7213d0510b4a6ba6700000080010000800200008000'))
tx2 = tx_from_any('cHNidP8BAKACAAAAAqsJSaCMWvfEm4IS9Bfi8Vqz9cM9zxU4IagTn4d6W3vkAAAAAAD+////qwlJoIxa98SbghL0F+LxWrP1wz3PFTghqBOfh3pbe+QBAAAAAP7///8CYDvqCwAAAAAZdqkUdopAu9dAy+gdmI5x3ipNXHE5ax2IrI4kAAAAAAAAGXapFG9GILVT+glechue4O/p+gOcykWXiKwAAAAAAAEA3wIAAAABJoFxNx7f8oXpN63upLN7eAAMBWbLs61kZBcTykIXG/YAAAAAakcwRAIgcLIkUSPmv0dNYMW1DAQ9TGkaXSQ18Jo0p2YqncJReQoCIAEynKnazygL3zB0DsA5BCJCLIHLRYOUV663b8Eu3ZWzASECZX0RjTNXuOD0ws1G23s59tnDjZpwq8ubLeXcjb/kzjH+////AtPf9QUAAAAAGXapFNDFmQPFusKGh2DpD9UhpGZap2UgiKwA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHh7MuEwAAAQEgAOH1BQAAAAAXqRQ1RebjO4MsRwUPJNPuuTycA5SLx4cBBBYAFIXRNTfy4mVAWjTbr6nj3aAfuCMIACICAurVlmh8qAYEPtw94RbN8p1eklfBls0FXPaYyNAr8k6ZELSmumcAAACAAAAAgAIAAIAAIgIDlPYr6d8ZlSxVh3aK63aYBhrSxKJciU9H2MFitNchPQUQtKa6ZwAAAIABAACAAgAAgAA=')
for tx in (tx1, tx2):
self.assertEqual(2, len(tx.inputs()))
self.assertFalse(tx.inputs()[0].is_complete())
self.assertFalse(tx.inputs()[1].is_complete())
self.assertTrue(tx.inputs()[1].redeem_script is not None)
def test_valid_psbt_005(self):
# Case: PSBT with one P2SH-P2WSH input of a 2-of-2 multisig, redeemScript, witnessScript, and keypaths are available. Contains one signature.
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAEBIJVe6gsAAAAAF6kUY0UgD2jRieGtwN8cTRbqjxTA2+uHIgIDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUZGMEMCIAQktY7/qqaU4VWepck7v9SokGQiQFXN8HC2dxRpRC0HAh9cjrD+plFtYLisszrWTt5g6Hhb+zqpS5m9+GFR25qaAQEEIgAgdx/RitRZZm3Unz1WTj28QvTIR3TjYK2haBao7UiNVoEBBUdSIQOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RiED3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg71SriIGA7E0HMunaDtq9PEjjNbpfnFn1Wn6xH8eSNR1QYRDVb1GELSmumcAAACAAAAAgAQAAIAiBgPeVdHh2sgF4/iljB+/m5TALz26r+En/vykmV8m+CCDvRC0prpnAAAAgAAAAIAFAACAAAA=')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.inputs()))
self.assertFalse(tx.inputs()[0].is_complete())
self.assertTrue(tx.inputs()[0].redeem_script is not None)
self.assertTrue(tx.inputs()[0].witness_script is not None)
self.assertEqual(2, len(tx.inputs()[0].bip32_paths))
self.assertEqual(1, len(tx.inputs()[0].part_sigs))
def test_valid_psbt_006(self):
# Case: PSBT with one P2WSH input of a 2-of-2 multisig. witnessScript, keypaths, and global xpubs are available. Contains no signatures. Outputs filled.
tx1 = tx_from_any(bytes.fromhex('70736274ff01005202000000019dfc6628c26c5899fe1bd3dc338665bfd55d7ada10f6220973df2d386dec12760100000000ffffffff01f03dcd1d000000001600147b3a00bfdc14d27795c2b74901d09da6ef133579000000004f01043587cf02da3fd0088000000097048b1ad0445b1ec8275517727c87b4e4ebc18a203ffa0f94c01566bd38e9000351b743887ee1d40dc32a6043724f2d6459b3b5a4d73daec8fbae0472f3bc43e20cd90c6a4fae000080000000804f01043587cf02da3fd00880000001b90452427139cd78c2cff2444be353cd58605e3e513285e528b407fae3f6173503d30a5e97c8adbc557dac2ad9a7e39c1722ebac69e668b6f2667cc1d671c83cab0cd90c6a4fae000080010000800001012b0065cd1d000000002200202c5486126c4978079a814e13715d65f36459e4d6ccaded266d0508645bafa6320105475221029da12cdb5b235692b91536afefe5c91c3ab9473d8e43b533836ab456299c88712103372b34234ed7cf9c1fea5d05d441557927be9542b162eb02e1ab2ce80224c00b52ae2206029da12cdb5b235692b91536afefe5c91c3ab9473d8e43b533836ab456299c887110d90c6a4fae0000800000008000000000220603372b34234ed7cf9c1fea5d05d441557927be9542b162eb02e1ab2ce80224c00b10d90c6a4fae0000800100008000000000002202039eff1f547a1d5f92dfa2ba7af6ac971a4bd03ba4a734b03156a256b8ad3a1ef910ede45cc500000080000000800100008000'))
tx2 = tx_from_any('cHNidP8BAFICAAAAAZ38ZijCbFiZ/hvT3DOGZb/VXXraEPYiCXPfLTht7BJ2AQAAAAD/////AfA9zR0AAAAAFgAUezoAv9wU0neVwrdJAdCdpu8TNXkAAAAATwEENYfPAto/0AiAAAAAlwSLGtBEWx7IJ1UXcnyHtOTrwYogP/oPlMAVZr046QADUbdDiH7h1A3DKmBDck8tZFmztaTXPa7I+64EcvO8Q+IM2QxqT64AAIAAAACATwEENYfPAto/0AiAAAABuQRSQnE5zXjCz/JES+NTzVhgXj5RMoXlKLQH+uP2FzUD0wpel8itvFV9rCrZp+OcFyLrrGnmaLbyZnzB1nHIPKsM2QxqT64AAIABAACAAAEBKwBlzR0AAAAAIgAgLFSGEmxJeAeagU4TcV1l82RZ5NbMre0mbQUIZFuvpjIBBUdSIQKdoSzbWyNWkrkVNq/v5ckcOrlHPY5DtTODarRWKZyIcSEDNys0I07Xz5wf6l0F1EFVeSe+lUKxYusC4ass6AIkwAtSriIGAp2hLNtbI1aSuRU2r+/lyRw6uUc9jkO1M4NqtFYpnIhxENkMak+uAACAAAAAgAAAAAAiBgM3KzQjTtfPnB/qXQXUQVV5J76VQrFi6wLhqyzoAiTACxDZDGpPrgAAgAEAAIAAAAAAACICA57/H1R6HV+S36K6evaslxpL0DukpzSwMVaiVritOh75EO3kXMUAAACAAAAAgAEAAIAA')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.inputs()))
self.assertFalse(tx.inputs()[0].is_complete())
self.assertTrue(tx.inputs()[0].witness_script is not None)
self.assertEqual(2, len(tx.inputs()[0].bip32_paths))
self.assertEqual(2, len(tx.xpubs))
self.assertEqual(0, len(tx.inputs()[0].part_sigs))
def test_valid_psbt_007(self):
# Case: PSBT with unknown types in the inputs.
tx1 = tx_from_any(bytes.fromhex('70736274ff01003f0200000001ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000000000ffffffff010000000000000000036a010000000000000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f0000'))
tx2 = tx_from_any('cHNidP8BAD8CAAAAAf//////////////////////////////////////////AAAAAAD/////AQAAAAAAAAAAA2oBAAAAAAAACg8BAgMEBQYHCAkPAQIDBAUGBwgJCgsMDQ4PAAA=')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.inputs()))
self.assertEqual(1, len(tx.inputs()[0]._unknown))
def test_valid_psbt_008(self):
# Case: PSBT with `PSBT_GLOBAL_XPUB`.
constants.set_mainnet()
try:
tx1 = tx_from_any(bytes.fromhex('70736274ff01009d0100000002710ea76ab45c5cb6438e607e59cc037626981805ae9e0dfd9089012abb0be5350100000000ffffffff190994d6a8b3c8c82ccbcfb2fba4106aa06639b872a8d447465c0d42588d6d670000000000ffffffff0200e1f505000000001976a914b6bc2c0ee5655a843d79afedd0ccc3f7dd64340988ac605af405000000001600141188ef8e4ce0449eaac8fb141cbf5a1176e6a088000000004f010488b21e039e530cac800000003dbc8a5c9769f031b17e77fea1518603221a18fd18f2b9a54c6c8c1ac75cbc3502f230584b155d1c7f1cd45120a653c48d650b431b67c5b2c13f27d7142037c1691027569c503100008000000080000000800001011f00e1f5050000000016001433b982f91b28f160c920b4ab95e58ce50dda3a4a220203309680f33c7de38ea6a47cd4ecd66f1f5a49747c6ffb8808ed09039243e3ad5c47304402202d704ced830c56a909344bd742b6852dccd103e963bae92d38e75254d2bb424502202d86c437195df46c0ceda084f2a291c3da2d64070f76bf9b90b195e7ef28f77201220603309680f33c7de38ea6a47cd4ecd66f1f5a49747c6ffb8808ed09039243e3ad5c1827569c5031000080000000800000008000000000010000000001011f00e1f50500000000160014388fb944307eb77ef45197d0b0b245e079f011de220202c777161f73d0b7c72b9ee7bde650293d13f095bc7656ad1f525da5fd2e10b11047304402204cb1fb5f869c942e0e26100576125439179ae88dca8a9dc3ba08f7953988faa60220521f49ca791c27d70e273c9b14616985909361e25be274ea200d7e08827e514d01220602c777161f73d0b7c72b9ee7bde650293d13f095bc7656ad1f525da5fd2e10b1101827569c5031000080000000800000008000000000000000000000220202d20ca502ee289686d21815bd43a80637b0698e1fbcdbe4caed445f6c1a0a90ef1827569c50310000800000008000000080000000000400000000'))
tx2 = tx_from_any('cHNidP8BAJ0BAAAAAnEOp2q0XFy2Q45gflnMA3YmmBgFrp4N/ZCJASq7C+U1AQAAAAD/////GQmU1qizyMgsy8+y+6QQaqBmObhyqNRHRlwNQliNbWcAAAAAAP////8CAOH1BQAAAAAZdqkUtrwsDuVlWoQ9ea/t0MzD991kNAmIrGBa9AUAAAAAFgAUEYjvjkzgRJ6qyPsUHL9aEXbmoIgAAAAATwEEiLIeA55TDKyAAAAAPbyKXJdp8DGxfnf+oVGGAyIaGP0Y8rmlTGyMGsdcvDUC8jBYSxVdHH8c1FEgplPEjWULQxtnxbLBPyfXFCA3wWkQJ1acUDEAAIAAAACAAAAAgAABAR8A4fUFAAAAABYAFDO5gvkbKPFgySC0q5XljOUN2jpKIgIDMJaA8zx9446mpHzU7NZvH1pJdHxv+4gI7QkDkkPjrVxHMEQCIC1wTO2DDFapCTRL10K2hS3M0QPpY7rpLTjnUlTSu0JFAiAthsQ3GV30bAztoITyopHD2i1kBw92v5uQsZXn7yj3cgEiBgMwloDzPH3jjqakfNTs1m8fWkl0fG/7iAjtCQOSQ+OtXBgnVpxQMQAAgAAAAIAAAACAAAAAAAEAAAAAAQEfAOH1BQAAAAAWABQ4j7lEMH63fvRRl9CwskXgefAR3iICAsd3Fh9z0LfHK57nveZQKT0T8JW8dlatH1Jdpf0uELEQRzBEAiBMsftfhpyULg4mEAV2ElQ5F5rojcqKncO6CPeVOYj6pgIgUh9JynkcJ9cOJzybFGFphZCTYeJb4nTqIA1+CIJ+UU0BIgYCx3cWH3PQt8crnue95lApPRPwlbx2Vq0fUl2l/S4QsRAYJ1acUDEAAIAAAACAAAAAgAAAAAAAAAAAAAAiAgLSDKUC7iiWhtIYFb1DqAY3sGmOH7zb5MrtRF9sGgqQ7xgnVpxQMQAAgAAAAIAAAACAAAAAAAQAAAAA')
for tx in (tx1, tx2):
self.assertEqual(1, len(tx.xpubs))
finally:
constants.set_testnet()
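A quick orientation note for the vectors in these tests: every serialized PSBT begins with the 5-byte magic `b'psbt\xff'`, which is why each hex vector starts with `70736274ff` and each base64 vector starts with `cHNidP8`. A minimal standalone sanity check (a sketch, not part of the test classes):

```python
import base64

# Every serialized PSBT begins with the magic bytes b'psbt\xff' (hex "70736274ff").
MAGIC = b'psbt\xff'
assert bytes.fromhex('70736274ff') == MAGIC

# In the base64 vectors, the magic plus the first payload byte (0x01, the
# compact-size length of the PSBT_GLOBAL_UNSIGNED_TX key) encodes to
# "cHNidP8B", which is why every valid vector string starts with "cHNidP8".
assert base64.b64encode(MAGIC + b'\x01').decode() == 'cHNidP8B'
```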
class TestInvalidPSBT(TestCaseForTestnet):
# test cases from BIP-0174
def test_invalid_psbt_001(self):
# Case: Network transaction, not PSBT format
with self.assertRaises(BadHeaderMagic):
tx1 = PartialTransaction.from_raw_psbt(bytes.fromhex('0200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf6000000006a473044022070b2245123e6bf474d60c5b50c043d4c691a5d2435f09a34a7662a9dc251790a022001329ca9dacf280bdf30740ec0390422422c81cb45839457aeb76fc12edd95b3012102657d118d3357b8e0f4c2cd46db7b39f6d9c38d9a70abcb9b2de5dc8dbfe4ce31feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e1300'))
with self.assertRaises(BadHeaderMagic):
tx2 = PartialTransaction.from_raw_psbt('AgAAAAEmgXE3Ht/yhek3re6ks3t4AAwFZsuzrWRkFxPKQhcb9gAAAABqRzBEAiBwsiRRI+a/R01gxbUMBD1MaRpdJDXwmjSnZiqdwlF5CgIgATKcqdrPKAvfMHQOwDkEIkIsgctFg5RXrrdvwS7dlbMBIQJlfRGNM1e44PTCzUbbezn22cONmnCry5st5dyNv+TOMf7///8C09/1BQAAAAAZdqkU0MWZA8W6woaHYOkP1SGkZlqnZSCIrADh9QUAAAAAF6kUNUXm4zuDLEcFDyTT7rk8nAOUi8eHsy4TAA==')
def test_invalid_psbt_002(self):
# Case: PSBT missing outputs
with self.assertRaises(UnexpectedEndOfStream):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100750200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf60000000000feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e1300000100fda5010100000000010289a3c71eab4d20e0371bbba4cc698fa295c9463afa2e397f8533ccb62f9567e50100000017160014be18d152a9b012039daf3da7de4f53349eecb985ffffffff86f8aa43a71dff1448893a530a7237ef6b4608bbb2dd2d0171e63aec6a4890b40100000017160014fe3e9ef1a745e974d902c4355943abcb34bd5353ffffffff0200c2eb0b000000001976a91485cff1097fd9e008bb34af709c62197b38978a4888ac72fef84e2c00000017a914339725ba21efd62ac753a9bcd067d6c7a6a39d05870247304402202712be22e0270f394f568311dc7ca9a68970b8025fdd3b240229f07f8a5f3a240220018b38d7dcd314e734c9276bd6fb40f673325bc4baa144c800d2f2f02db2765c012103d2e15674941bad4a996372cb87e1856d3652606d98562fe39c5e9e7e413f210502483045022100d12b852d85dcd961d2f5f4ab660654df6eedcc794c0c33ce5cc309ffb5fce58d022067338a8e0e1725c197fb1a88af59f51e44e4255b20167c8684031c05d1f2592a01210223b72beef0965d10be0778efecd61fcac6f79a4ea169393380734464f84f2ab30000000000'))
with self.assertRaises(UnexpectedEndOfStream):
tx2 = tx_from_any('cHNidP8BAHUCAAAAASaBcTce3/KF6Tet7qSze3gADAVmy7OtZGQXE8pCFxv2AAAAAAD+////AtPf9QUAAAAAGXapFNDFmQPFusKGh2DpD9UhpGZap2UgiKwA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHh7MuEwAAAQD9pQEBAAAAAAECiaPHHqtNIOA3G7ukzGmPopXJRjr6Ljl/hTPMti+VZ+UBAAAAFxYAFL4Y0VKpsBIDna89p95PUzSe7LmF/////4b4qkOnHf8USIk6UwpyN+9rRgi7st0tAXHmOuxqSJC0AQAAABcWABT+Pp7xp0XpdNkCxDVZQ6vLNL1TU/////8CAMLrCwAAAAAZdqkUhc/xCX/Z4Ai7NK9wnGIZeziXikiIrHL++E4sAAAAF6kUM5cluiHv1irHU6m80GfWx6ajnQWHAkcwRAIgJxK+IuAnDzlPVoMR3HyppolwuAJf3TskAinwf4pfOiQCIAGLONfc0xTnNMkna9b7QPZzMlvEuqFEyADS8vAtsnZcASED0uFWdJQbrUqZY3LLh+GFbTZSYG2YVi/jnF6efkE/IQUCSDBFAiEA0SuFLYXc2WHS9fSrZgZU327tzHlMDDPOXMMJ/7X85Y0CIGczio4OFyXBl/saiK9Z9R5E5CVbIBZ8hoQDHAXR8lkqASECI7cr7vCWXRC+B3jv7NYfysb3mk6haTkzgHNEZPhPKrMAAAAAAA==')
def test_invalid_psbt_003(self):
# Case: PSBT where one input has a filled scriptSig in the unsigned tx
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100fd0a010200000002ab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be4000000006a47304402204759661797c01b036b25928948686218347d89864b719e1f7fcf57d1e511658702205309eabf56aa4d8891ffd111fdf1336f3a29da866d7f8486d75546ceedaf93190121035cdc61fc7ba971c0b501a646a2a83b102cb43881217ca682dc86e2d73fa88292feffffffab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40100000000feffffff02603bea0b000000001976a914768a40bbd740cbe81d988e71de2a4d5c71396b1d88ac8e240000000000001976a9146f4620b553fa095e721b9ee0efe9fa039cca459788ac00000000000001012000e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787010416001485d13537f2e265405a34dbafa9e3dda01fb82308000000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAP0KAQIAAAACqwlJoIxa98SbghL0F+LxWrP1wz3PFTghqBOfh3pbe+QAAAAAakcwRAIgR1lmF5fAGwNrJZKJSGhiGDR9iYZLcZ4ff89X0eURZYcCIFMJ6r9Wqk2Ikf/REf3xM286KdqGbX+EhtdVRs7tr5MZASEDXNxh/HupccC1AaZGoqg7ECy0OIEhfKaC3Ibi1z+ogpL+////qwlJoIxa98SbghL0F+LxWrP1wz3PFTghqBOfh3pbe+QBAAAAAP7///8CYDvqCwAAAAAZdqkUdopAu9dAy+gdmI5x3ipNXHE5ax2IrI4kAAAAAAAAGXapFG9GILVT+glechue4O/p+gOcykWXiKwAAAAAAAABASAA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHhwEEFgAUhdE1N/LiZUBaNNuvqePdoB+4IwgAAAA=')
def test_invalid_psbt_004(self):
# Case: PSBT where inputs and outputs are provided but without an unsigned tx
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff000100fda5010100000000010289a3c71eab4d20e0371bbba4cc698fa295c9463afa2e397f8533ccb62f9567e50100000017160014be18d152a9b012039daf3da7de4f53349eecb985ffffffff86f8aa43a71dff1448893a530a7237ef6b4608bbb2dd2d0171e63aec6a4890b40100000017160014fe3e9ef1a745e974d902c4355943abcb34bd5353ffffffff0200c2eb0b000000001976a91485cff1097fd9e008bb34af709c62197b38978a4888ac72fef84e2c00000017a914339725ba21efd62ac753a9bcd067d6c7a6a39d05870247304402202712be22e0270f394f568311dc7ca9a68970b8025fdd3b240229f07f8a5f3a240220018b38d7dcd314e734c9276bd6fb40f673325bc4baa144c800d2f2f02db2765c012103d2e15674941bad4a996372cb87e1856d3652606d98562fe39c5e9e7e413f210502483045022100d12b852d85dcd961d2f5f4ab660654df6eedcc794c0c33ce5cc309ffb5fce58d022067338a8e0e1725c197fb1a88af59f51e44e4255b20167c8684031c05d1f2592a01210223b72beef0965d10be0778efecd61fcac6f79a4ea169393380734464f84f2ab30000000000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8AAQD9pQEBAAAAAAECiaPHHqtNIOA3G7ukzGmPopXJRjr6Ljl/hTPMti+VZ+UBAAAAFxYAFL4Y0VKpsBIDna89p95PUzSe7LmF/////4b4qkOnHf8USIk6UwpyN+9rRgi7st0tAXHmOuxqSJC0AQAAABcWABT+Pp7xp0XpdNkCxDVZQ6vLNL1TU/////8CAMLrCwAAAAAZdqkUhc/xCX/Z4Ai7NK9wnGIZeziXikiIrHL++E4sAAAAF6kUM5cluiHv1irHU6m80GfWx6ajnQWHAkcwRAIgJxK+IuAnDzlPVoMR3HyppolwuAJf3TskAinwf4pfOiQCIAGLONfc0xTnNMkna9b7QPZzMlvEuqFEyADS8vAtsnZcASED0uFWdJQbrUqZY3LLh+GFbTZSYG2YVi/jnF6efkE/IQUCSDBFAiEA0SuFLYXc2WHS9fSrZgZU327tzHlMDDPOXMMJ/7X85Y0CIGczio4OFyXBl/saiK9Z9R5E5CVbIBZ8hoQDHAXR8lkqASECI7cr7vCWXRC+B3jv7NYfysb3mk6haTkzgHNEZPhPKrMAAAAAAA==')
def test_invalid_psbt_005(self):
# Case: PSBT with duplicate keys in an input
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100750200000001268171371edff285e937adeea4b37b78000c0566cbb3ad64641713ca42171bf60000000000feffffff02d3dff505000000001976a914d0c59903c5bac2868760e90fd521a4665aa7652088ac00e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787b32e1300000100fda5010100000000010289a3c71eab4d20e0371bbba4cc698fa295c9463afa2e397f8533ccb62f9567e50100000017160014be18d152a9b012039daf3da7de4f53349eecb985ffffffff86f8aa43a71dff1448893a530a7237ef6b4608bbb2dd2d0171e63aec6a4890b40100000017160014fe3e9ef1a745e974d902c4355943abcb34bd5353ffffffff0200c2eb0b000000001976a91485cff1097fd9e008bb34af709c62197b38978a4888ac72fef84e2c00000017a914339725ba21efd62ac753a9bcd067d6c7a6a39d05870247304402202712be22e0270f394f568311dc7ca9a68970b8025fdd3b240229f07f8a5f3a240220018b38d7dcd314e734c9276bd6fb40f673325bc4baa144c800d2f2f02db2765c012103d2e15674941bad4a996372cb87e1856d3652606d98562fe39c5e9e7e413f210502483045022100d12b852d85dcd961d2f5f4ab660654df6eedcc794c0c33ce5cc309ffb5fce58d022067338a8e0e1725c197fb1a88af59f51e44e4255b20167c8684031c05d1f2592a01210223b72beef0965d10be0778efecd61fcac6f79a4ea169393380734464f84f2ab30000000001003f0200000001ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000000000ffffffff010000000000000000036a010000000000000000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAHUCAAAAASaBcTce3/KF6Tet7qSze3gADAVmy7OtZGQXE8pCFxv2AAAAAAD+////AtPf9QUAAAAAGXapFNDFmQPFusKGh2DpD9UhpGZap2UgiKwA4fUFAAAAABepFDVF5uM7gyxHBQ8k0+65PJwDlIvHh7MuEwAAAQD9pQEBAAAAAAECiaPHHqtNIOA3G7ukzGmPopXJRjr6Ljl/hTPMti+VZ+UBAAAAFxYAFL4Y0VKpsBIDna89p95PUzSe7LmF/////4b4qkOnHf8USIk6UwpyN+9rRgi7st0tAXHmOuxqSJC0AQAAABcWABT+Pp7xp0XpdNkCxDVZQ6vLNL1TU/////8CAMLrCwAAAAAZdqkUhc/xCX/Z4Ai7NK9wnGIZeziXikiIrHL++E4sAAAAF6kUM5cluiHv1irHU6m80GfWx6ajnQWHAkcwRAIgJxK+IuAnDzlPVoMR3HyppolwuAJf3TskAinwf4pfOiQCIAGLONfc0xTnNMkna9b7QPZzMlvEuqFEyADS8vAtsnZcASED0uFWdJQbrUqZY3LLh+GFbTZSYG2YVi/jnF6efkE/IQUCSDBFAiEA0SuFLYXc2WHS9fSrZgZU327tzHlMDDPOXMMJ/7X85Y0CIGczio4OFyXBl/saiK9Z9R5E5CVbIBZ8hoQDHAXR8lkqASECI7cr7vCWXRC+B3jv7NYfysb3mk6haTkzgHNEZPhPKrMAAAAAAQA/AgAAAAH//////////////////////////////////////////wAAAAAA/////wEAAAAAAAAAAANqAQAAAAAAAAAA')
def test_invalid_psbt_006(self):
# Case: PSBT with invalid global transaction typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff020001550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8CAAFVAgAAAAEnmiMjpd+1H8RfIg+liw/BPh4zQnkqhdfjbNYzO1y8OQAAAAAA/////wGgWuoLAAAAABl2qRT/6cAGEJfMO2NvLLBGD6T8Qn0rRYisAAAAAAABASCVXuoLAAAAABepFGNFIA9o0YnhrcDfHE0W6o8UwNvrhyICA7E0HMunaDtq9PEjjNbpfnFn1Wn6xH8eSNR1QYRDVb1GRjBDAiAEJLWO/6qmlOFVnqXJO7/UqJBkIkBVzfBwtncUaUQtBwIfXI6w/qZRbWC4rLM61k7eYOh4W/s6qUuZvfhhUduamgEBBCIAIHcf0YrUWWZt1J89Vk49vEL0yEd042CtoWgWqO1IjVaBAQVHUiEDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYhA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9Uq4iBgOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RhC0prpnAAAAgAAAAIAEAACAIgYD3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg70QtKa6ZwAAAIAAAACABQAAgAAA')
def test_invalid_psbt_007(self):
# Case: PSBT with invalid input witness utxo typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac000000000002010020955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAIBACCVXuoLAAAAABepFGNFIA9o0YnhrcDfHE0W6o8UwNvrhyICA7E0HMunaDtq9PEjjNbpfnFn1Wn6xH8eSNR1QYRDVb1GRjBDAiAEJLWO/6qmlOFVnqXJO7/UqJBkIkBVzfBwtncUaUQtBwIfXI6w/qZRbWC4rLM61k7eYOh4W/s6qUuZvfhhUduamgEBBCIAIHcf0YrUWWZt1J89Vk49vEL0yEd042CtoWgWqO1IjVaBAQVHUiEDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYhA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9Uq4iBgOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RhC0prpnAAAAgAAAAIAEAACAIgYD3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg70QtKa6ZwAAAIAAAACABQAAgAAA')
def test_invalid_psbt_008(self):
# Case: PSBT with invalid pubkey length for input partial signature typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87210203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd46304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAEBIJVe6gsAAAAAF6kUY0UgD2jRieGtwN8cTRbqjxTA2+uHIQIDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYwQwIgBCS1jv+qppThVZ6lyTu/1KiQZCJAVc3wcLZ3FGlELQcCH1yOsP6mUW1guKyzOtZO3mDoeFv7OqlLmb34YVHbmpoBAQQiACB3H9GK1FlmbdSfPVZOPbxC9MhHdONgraFoFqjtSI1WgQEFR1IhA7E0HMunaDtq9PEjjNbpfnFn1Wn6xH8eSNR1QYRDVb1GIQPeVdHh2sgF4/iljB+/m5TALz26r+En/vykmV8m+CCDvVKuIgYDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYQtKa6ZwAAAIAAAACABAAAgCIGA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9ELSmumcAAACAAAAAgAUAAIAAAA==')
def test_invalid_psbt_009(self):
# Case: PSBT with invalid redeemscript typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a01020400220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAEBIJVe6gsAAAAAF6kUY0UgD2jRieGtwN8cTRbqjxTA2+uHIgIDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUZGMEMCIAQktY7/qqaU4VWepck7v9SokGQiQFXN8HC2dxRpRC0HAh9cjrD+plFtYLisszrWTt5g6Hhb+zqpS5m9+GFR25qaAQIEACIAIHcf0YrUWWZt1J89Vk49vEL0yEd042CtoWgWqO1IjVaBAQVHUiEDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYhA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9Uq4iBgOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RhC0prpnAAAAgAAAAIAEAACAIgYD3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg70QtKa6ZwAAAIAAAACABQAAgAAA')
def test_invalid_psbt_010(self):
# Case: PSBT with invalid witnessscript typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d568102050047522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae220603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4610b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAEBIJVe6gsAAAAAF6kUY0UgD2jRieGtwN8cTRbqjxTA2+uHIgIDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUZGMEMCIAQktY7/qqaU4VWepck7v9SokGQiQFXN8HC2dxRpRC0HAh9cjrD+plFtYLisszrWTt5g6Hhb+zqpS5m9+GFR25qaAQEEIgAgdx/RitRZZm3Unz1WTj28QvTIR3TjYK2haBao7UiNVoECBQBHUiEDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUYhA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9Uq4iBgOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RhC0prpnAAAAgAAAAIAEAACAIgYD3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg70QtKa6ZwAAAIAAAACABQAAgAAA')
def test_invalid_psbt_011(self):
# Case: PSBT with invalid BIP32 typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100550200000001279a2323a5dfb51fc45f220fa58b0fc13e1e3342792a85d7e36cd6333b5cbc390000000000ffffffff01a05aea0b000000001976a914ffe9c0061097cc3b636f2cb0460fa4fc427d2b4588ac0000000000010120955eea0b0000000017a9146345200f68d189e1adc0df1c4d16ea8f14c0dbeb87220203b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd4646304302200424b58effaaa694e1559ea5c93bbfd4a89064224055cdf070b6771469442d07021f5c8eb0fea6516d60b8acb33ad64ede60e8785bfb3aa94b99bdf86151db9a9a010104220020771fd18ad459666dd49f3d564e3dbc42f4c84774e360ada16816a8ed488d5681010547522103b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd462103de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd52ae210603b1341ccba7683b6af4f1238cd6e97e7167d569fac47f1e48d47541844355bd10b4a6ba67000000800000008004000080220603de55d1e1dac805e3f8a58c1fbf9b94c02f3dbaafe127fefca4995f26f82083bd10b4a6ba670000008000000080050000800000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAFUCAAAAASeaIyOl37UfxF8iD6WLD8E+HjNCeSqF1+Ns1jM7XLw5AAAAAAD/////AaBa6gsAAAAAGXapFP/pwAYQl8w7Y28ssEYPpPxCfStFiKwAAAAAAAEBIJVe6gsAAAAAF6kUY0UgD2jRieGtwN8cTRbqjxTA2+uHIgIDsTQcy6doO2r08SOM1ul+cWfVafrEfx5I1HVBhENVvUZGMEMCIAQktY7/qqaU4VWepck7v9SokGQiQFXN8HC2dxRpRC0HAh9cjrD+plFtYLisszrWTt5g6Hhb+zqpS5m9+GFR25qaAQEEIgAgdx/RitRZZm3Unz1WTj28QvTIR3TjYK2haBao7UiNVoEBBUdSIQOxNBzLp2g7avTxI4zW6X5xZ9Vp+sR/HkjUdUGEQ1W9RiED3lXR4drIBeP4pYwfv5uUwC89uq/hJ/78pJlfJvggg71SriEGA7E0HMunaDtq9PEjjNbpfnFn1Wn6xH8eSNR1QYRDVb0QtKa6ZwAAAIAAAACABAAAgCIGA95V0eHayAXj+KWMH7+blMAvPbqv4Sf+/KSZXyb4IIO9ELSmumcAAACAAAAAgAUAAIAAAA==')
def test_invalid_psbt_012(self):
# Case: PSBT with invalid non-witness utxo typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f0000000000020000bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f6187650000000107da00473044022074018ad4180097b873323c0015720b3684cc8123891048e7dbcd9b55ad679c99022073d369b740e3eb53dcefa33823c8070514ca55a7dd9544f157c167913261118c01483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae0001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8870107232200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b20289030108da0400473044022062eb7a556107a7c73f45ac4ab5a1dddf6f7075fb1275969a7f383efff784bcb202200c05dbb7470dbf2f08557dd356c7325c1ed30913e996cd3840945db12228da5f01473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d20147522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae00220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAIAALsCAAAAAarXOTEBi9JfhK5AC2iEi+CdtwbqwqwYKYur7nGrZW+LAAAAAEhHMEQCIFj2/HxqM+GzFUjUgcgmwBW9MBNarULNZ3kNq2bSrSQ7AiBKHO0mBMZzW2OT5bQWkd14sA8MWUL7n3UYVvqpOBV9ugH+////AoDw+gIAAAAAF6kUD7lGNCFpa4LIM68kHHjBfdveSTSH0PIKJwEAAAAXqRQpynT4oI+BmZQoGFyXtdhS5AY/YYdlAAAAAQfaAEcwRAIgdAGK1BgAl7hzMjwAFXILNoTMgSOJEEjn282bVa1nnJkCIHPTabdA4+tT3O+jOCPIBwUUylWn3ZVE8VfBZ5EyYRGMAUgwRQIhAPYQOLMI3B2oZaNIUnRvAVdyk0IIxtJEVDk82ZvfIhd3AiAFbmdaZ1ptCgK4WxTl4pB02KJam1dgvqKBb2YZEKAG6gFHUiEClYO/Oa4KYJdHrRma3dY0+mEIVZ1sXNObTCGD8auW4H8hAtq2H/SaFNtqfQKwzR+7ePxLGDErW05U2uTbovv+9TbXUq4AAQEgAMLrCwAAAAAXqRS39fr0Dj1ApaRZsds1NfK3L6kh6IcBByMiACCMI1MXN0O1ld+0oHtyuo5C43l9p06H/n2ddJfjsgKJAwEI2gQARzBEAiBi63pVYQenxz9FrEq1od3fb3B1+xJ1lpp/OD7/94S8sgIgDAXbt0cNvy8IVX3TVscyXB7TCRPpls04QJRdsSIo2l8BRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBR1IhAwidwQx6xttU+RMpr2FzM9s4jOrQwjH3IzedG5kDCwLcIQI63ZBPPW3PWd25BrDe4jUpt/+57VDl6GFRkmhgIh8Oc1KuACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
def test_invalid_psbt_013(self):
# Case: PSBT with invalid final scriptSig typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f618765000000020700da00473044022074018ad4180097b873323c0015720b3684cc8123891048e7dbcd9b55ad679c99022073d369b740e3eb53dcefa33823c8070514ca55a7dd9544f157c167913261118c01483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae0001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8870107232200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b20289030108da0400473044022062eb7a556107a7c73f45ac4ab5a1dddf6f7075fb1275969a7f383efff784bcb202200c05dbb7470dbf2f08557dd356c7325c1ed30913e996cd3840945db12228da5f01473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d20147522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae00220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAACBwDaAEcwRAIgdAGK1BgAl7hzMjwAFXILNoTMgSOJEEjn282bVa1nnJkCIHPTabdA4+tT3O+jOCPIBwUUylWn3ZVE8VfBZ5EyYRGMAUgwRQIhAPYQOLMI3B2oZaNIUnRvAVdyk0IIxtJEVDk82ZvfIhd3AiAFbmdaZ1ptCgK4WxTl4pB02KJam1dgvqKBb2YZEKAG6gFHUiEClYO/Oa4KYJdHrRma3dY0+mEIVZ1sXNObTCGD8auW4H8hAtq2H/SaFNtqfQKwzR+7ePxLGDErW05U2uTbovv+9TbXUq4AAQEgAMLrCwAAAAAXqRS39fr0Dj1ApaRZsds1NfK3L6kh6IcBByMiACCMI1MXN0O1ld+0oHtyuo5C43l9p06H/n2ddJfjsgKJAwEI2gQARzBEAiBi63pVYQenxz9FrEq1od3fb3B1+xJ1lpp/OD7/94S8sgIgDAXbt0cNvy8IVX3TVscyXB7TCRPpls04QJRdsSIo2l8BRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBR1IhAwidwQx6xttU+RMpr2FzM9s4jOrQwjH3IzedG5kDCwLcIQI63ZBPPW3PWd25BrDe4jUpt/+57VDl6GFRkmhgIh8Oc1KuACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
def test_invalid_psbt_014(self):
# Case: PSBT with invalid final script witness typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f6187650000000107da00473044022074018ad4180097b873323c0015720b3684cc8123891048e7dbcd9b55ad679c99022073d369b740e3eb53dcefa33823c8070514ca55a7dd9544f157c167913261118c01483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae0001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8870107232200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b2028903020800da0400473044022062eb7a556107a7c73f45ac4ab5a1dddf6f7075fb1275969a7f383efff784bcb202200c05dbb7470dbf2f08557dd356c7325c1ed30913e996cd3840945db12228da5f01473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d20147522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae00220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAABB9oARzBEAiB0AYrUGACXuHMyPAAVcgs2hMyBI4kQSOfbzZtVrWecmQIgc9Npt0Dj61Pc76M4I8gHBRTKVafdlUTxV8FnkTJhEYwBSDBFAiEA9hA4swjcHahlo0hSdG8BV3KTQgjG0kRUOTzZm98iF3cCIAVuZ1pnWm0KArhbFOXikHTYolqbV2C+ooFvZhkQoAbqAUdSIQKVg785rgpgl0etGZrd1jT6YQhVnWxc05tMIYPxq5bgfyEC2rYf9JoU22p9ArDNH7t4/EsYMStbTlTa5Nui+/71NtdSrgABASAAwusLAAAAABepFLf1+vQOPUClpFmx2zU18rcvqSHohwEHIyIAIIwjUxc3Q7WV37Sge3K6jkLjeX2nTof+fZ10l+OyAokDAggA2gQARzBEAiBi63pVYQenxz9FrEq1od3fb3B1+xJ1lpp/OD7/94S8sgIgDAXbt0cNvy8IVX3TVscyXB7TCRPpls04QJRdsSIo2l8BRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBR1IhAwidwQx6xttU+RMpr2FzM9s4jOrQwjH3IzedG5kDCwLcIQI63ZBPPW3PWd25BrDe4jUpt/+57VDl6GFRkmhgIh8Oc1KuACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
def test_invalid_psbt_015(self):
# Case: PSBT with invalid pubkey in output BIP32 derivation paths typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f6187650000000107da00473044022074018ad4180097b873323c0015720b3684cc8123891048e7dbcd9b55ad679c99022073d369b740e3eb53dcefa33823c8070514ca55a7dd9544f157c167913261118c01483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae0001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8870107232200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b20289030108da0400473044022062eb7a556107a7c73f45ac4ab5a1dddf6f7075fb1275969a7f383efff784bcb202200c05dbb7470dbf2f08557dd356c7325c1ed30913e996cd3840945db12228da5f01473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d20147522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae00210203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca58710d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAABB9oARzBEAiB0AYrUGACXuHMyPAAVcgs2hMyBI4kQSOfbzZtVrWecmQIgc9Npt0Dj61Pc76M4I8gHBRTKVafdlUTxV8FnkTJhEYwBSDBFAiEA9hA4swjcHahlo0hSdG8BV3KTQgjG0kRUOTzZm98iF3cCIAVuZ1pnWm0KArhbFOXikHTYolqbV2C+ooFvZhkQoAbqAUdSIQKVg785rgpgl0etGZrd1jT6YQhVnWxc05tMIYPxq5bgfyEC2rYf9JoU22p9ArDNH7t4/EsYMStbTlTa5Nui+/71NtdSrgABASAAwusLAAAAABepFLf1+vQOPUClpFmx2zU18rcvqSHohwEHIyIAIIwjUxc3Q7WV37Sge3K6jkLjeX2nTof+fZ10l+OyAokDAQjaBABHMEQCIGLrelVhB6fHP0WsSrWh3d9vcHX7EnWWmn84Pv/3hLyyAiAMBdu3Rw2/LwhVfdNWxzJcHtMJE+mWzThAlF2xIijaXwFHMEQCIGX0W6WZi1mif/4ae+0BavHx+Q1Us6qPdFCqX1aiUQO9AiB/ckcDrR7blmgLKEtW1P/LiPf7dZ6rvgiqMPKbhROD0gFHUiEDCJ3BDHrG21T5EymvYXMz2ziM6tDCMfcjN50bmQMLAtwhAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zUq4AIQIDqaTDf1mW06ol26xrVwrwZQOUSSlCRgs1R1PtnuylhxDZDGpPAAAAgAAAAIAEAACAACICAn9jmXV9Lv9VoTatAsaEsYOLZVbl8bazQoKpS2tQBRCWENkMak8AAACAAAAAgAUAAIAA')
def test_invalid_psbt_016(self):
# Case: PSBT With invalid input sighash type typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100730200000001301ae986e516a1ec8ac5b4bc6573d32f83b465e23ad76167d68b38e730b4dbdb0000000000ffffffff02747b01000000000017a91403aa17ae882b5d0d54b25d63104e4ffece7b9ea2876043993b0000000017a914b921b1ba6f722e4bfa83b6557a3139986a42ec8387000000000001011f00ca9a3b00000000160014d2d94b64ae08587eefc8eeb187c601e939f9037c0203000100000000010016001462e9e982fff34dd8239610316b090cd2a3b747cb000100220020876bad832f1d168015ed41232a9ea65a1815d9ef13c0ef8759f64b5b2b278a65010125512103b7ce23a01c5b4bf00a642537cdfabb315b668332867478ef51309d2bd57f8a8751ae00'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAHMCAAAAATAa6YblFqHsisW0vGVz0y+DtGXiOtdhZ9aLOOcwtNvbAAAAAAD/////AnR7AQAAAAAAF6kUA6oXrogrXQ1Usl1jEE5P/s57nqKHYEOZOwAAAAAXqRS5IbG6b3IuS/qDtlV6MTmYakLsg4cAAAAAAAEBHwDKmjsAAAAAFgAU0tlLZK4IWH7vyO6xh8YB6Tn5A3wCAwABAAAAAAEAFgAUYunpgv/zTdgjlhAxawkM0qO3R8sAAQAiACCHa62DLx0WgBXtQSMqnqZaGBXZ7xPA74dZ9ktbKyeKZQEBJVEhA7fOI6AcW0vwCmQlN836uzFbZoMyhnR471EwnSvVf4qHUa4A')
def test_invalid_psbt_017(self):
# Case: PSBT With invalid output redeemScript typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100730200000001301ae986e516a1ec8ac5b4bc6573d32f83b465e23ad76167d68b38e730b4dbdb0000000000ffffffff02747b01000000000017a91403aa17ae882b5d0d54b25d63104e4ffece7b9ea2876043993b0000000017a914b921b1ba6f722e4bfa83b6557a3139986a42ec8387000000000001011f00ca9a3b00000000160014d2d94b64ae08587eefc8eeb187c601e939f9037c0002000016001462e9e982fff34dd8239610316b090cd2a3b747cb000100220020876bad832f1d168015ed41232a9ea65a1815d9ef13c0ef8759f64b5b2b278a65010125512103b7ce23a01c5b4bf00a642537cdfabb315b668332867478ef51309d2bd57f8a8751ae00'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAHMCAAAAATAa6YblFqHsisW0vGVz0y+DtGXiOtdhZ9aLOOcwtNvbAAAAAAD/////AnR7AQAAAAAAF6kUA6oXrogrXQ1Usl1jEE5P/s57nqKHYEOZOwAAAAAXqRS5IbG6b3IuS/qDtlV6MTmYakLsg4cAAAAAAAEBHwDKmjsAAAAAFgAU0tlLZK4IWH7vyO6xh8YB6Tn5A3wAAgAAFgAUYunpgv/zTdgjlhAxawkM0qO3R8sAAQAiACCHa62DLx0WgBXtQSMqnqZaGBXZ7xPA74dZ9ktbKyeKZQEBJVEhA7fOI6AcW0vwCmQlN836uzFbZoMyhnR471EwnSvVf4qHUa4A')
def test_invalid_psbt_018(self):
# Case: PSBT With invalid output witnessScript typed key
with self.assertRaises(SerializationError):
tx1 = tx_from_any(bytes.fromhex('70736274ff0100730200000001301ae986e516a1ec8ac5b4bc6573d32f83b465e23ad76167d68b38e730b4dbdb0000000000ffffffff02747b01000000000017a91403aa17ae882b5d0d54b25d63104e4ffece7b9ea2876043993b0000000017a914b921b1ba6f722e4bfa83b6557a3139986a42ec8387000000000001011f00ca9a3b00000000160014d2d94b64ae08587eefc8eeb187c601e939f9037c00010016001462e9e982fff34dd8239610316b090cd2a3b747cb000100220020876bad832f1d168015ed41232a9ea65a1815d9ef13c0ef8759f64b5b2b278a6521010025512103b7ce23a01c5b4bf00a642537cdfabb315b668332867478ef51309d2bd57f8a8751ae00'))
with self.assertRaises(SerializationError):
tx2 = tx_from_any('cHNidP8BAHMCAAAAATAa6YblFqHsisW0vGVz0y+DtGXiOtdhZ9aLOOcwtNvbAAAAAAD/////AnR7AQAAAAAAF6kUA6oXrogrXQ1Usl1jEE5P/s57nqKHYEOZOwAAAAAXqRS5IbG6b3IuS/qDtlV6MTmYakLsg4cAAAAAAAEBHwDKmjsAAAAAFgAU0tlLZK4IWH7vyO6xh8YB6Tn5A3wAAQAWABRi6emC//NN2COWEDFrCQzSo7dHywABACIAIIdrrYMvHRaAFe1BIyqeploYFdnvE8Dvh1n2S1srJ4plIQEAJVEhA7fOI6AcW0vwCmQlN836uzFbZoMyhnR471EwnSvVf4qHUa4A')
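Each invalid-PSBT case above is fed in twice, as hex and as base64, and both encodings decode to the same bytes: every BIP-174 PSBT, well-formed or not, begins with the five magic bytes `b'psbt\xff'` (hex `70736274ff`, which is why every hex vector starts with that prefix and every base64 vector starts with `cHNidP8`). A minimal standalone sketch of the magic-byte pre-check a deserializer can run before full parsing; `looks_like_psbt` is an invented helper, not part of this test suite:

```python
import base64

PSBT_MAGIC = b'psbt\xff'  # hex 70736274ff, per BIP-174

def looks_like_psbt(raw: bytes) -> bool:
    """Cheap pre-check before attempting full PSBT deserialization."""
    return raw.startswith(PSBT_MAGIC)

# Both encodings used by the test vectors share the same five-byte prefix.
assert looks_like_psbt(bytes.fromhex('70736274ff01'))
assert looks_like_psbt(base64.b64decode('cHNidP8BAA=='))
```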
class TestPSBTSignerChecks(TestCaseForTestnet):
# test cases from BIP-0174
@unittest.skip("the check this test is testing is intentionally disabled in transaction.py")
def test_psbt_fails_signer_checks_001(self):
# Case: A Witness UTXO is provided for a non-witness input
with self.assertRaises(PSBTInputConsistencyFailure):
tx1 = PartialTransaction.from_raw_psbt(bytes.fromhex('70736274ff0100a00200000002ab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40000000000feffffffab0949a08c5af7c49b8212f417e2f15ab3f5c33dcf153821a8139f877a5b7be40100000000feffffff02603bea0b000000001976a914768a40bbd740cbe81d988e71de2a4d5c71396b1d88ac8e240000000000001976a9146f4620b553fa095e721b9ee0efe9fa039cca459788ac0000000000010122d3dff505000000001976a914d48ed3110b94014cb114bd32d6f4d066dc74256b88ac0001012000e1f5050000000017a9143545e6e33b832c47050f24d3eeb93c9c03948bc787010416001485d13537f2e265405a34dbafa9e3dda01fb8230800220202ead596687ca806043edc3de116cdf29d5e9257c196cd055cf698c8d02bf24e9910b4a6ba670000008000000080020000800022020394f62be9df19952c5587768aeb7698061ad2c4a25c894f47d8c162b4d7213d0510b4a6ba6700000080010000800200008000'))
for txin in tx1.inputs():
txin.validate_data(for_signing=True)
with self.assertRaises(PSBTInputConsistencyFailure):
tx2 = PartialTransaction.from_raw_psbt('cHNidP8BAKACAAAAAqsJSaCMWvfEm4IS9Bfi8Vqz9cM9zxU4IagTn4d6W3vkAAAAAAD+////qwlJoIxa98SbghL0F+LxWrP1wz3PFTghqBOfh3pbe+QBAAAAAP7///8CYDvqCwAAAAAZdqkUdopAu9dAy+gdmI5x3ipNXHE5ax2IrI4kAAAAAAAAGXapFG9GILVT+glechue4O/p+gOcykWXiKwAAAAAAAEBItPf9QUAAAAAGXapFNSO0xELlAFMsRS9Mtb00GbcdCVriKwAAQEgAOH1BQAAAAAXqRQ1RebjO4MsRwUPJNPuuTycA5SLx4cBBBYAFIXRNTfy4mVAWjTbr6nj3aAfuCMIACICAurVlmh8qAYEPtw94RbN8p1eklfBls0FXPaYyNAr8k6ZELSmumcAAACAAAAAgAIAAIAAIgIDlPYr6d8ZlSxVh3aK63aYBhrSxKJciU9H2MFitNchPQUQtKa6ZwAAAIABAACAAgAAgAA=')
for txin in tx2.inputs():
txin.validate_data(for_signing=True)
def test_psbt_fails_signer_checks_002(self):
# Case: redeemScript with non-witness UTXO does not match the scriptPubKey
with self.assertRaises(PSBTInputConsistencyFailure):
            tx1 = PartialTransaction.from_raw_psbt(bytes.fromhex(
                '70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f618765000000220202dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d7483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01010304010000000104475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752af2206029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f10d90c6a4f000000800000008000000080220602dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d710d90c6a4f0000008000000080010000800001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8872202023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e73473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d2010103040100000001042200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b2028903010547522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae2206023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7310d90c6a4f000000800000008003000080220603089dc10c7ac6db54f91329'
                'af617333db388cead0c231f723379d1b99030b02dc10d90c6a4f00000080000000800200008000220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(PSBTInputConsistencyFailure):
tx2 = PartialTransaction.from_raw_psbt('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAAiAgLath/0mhTban0CsM0fu3j8SxgxK1tOVNrk26L7/vU210gwRQIhAPYQOLMI3B2oZaNIUnRvAVdyk0IIxtJEVDk82ZvfIhd3AiAFbmdaZ1ptCgK4WxTl4pB02KJam1dgvqKBb2YZEKAG6gEBAwQBAAAAAQRHUiEClYO/Oa4KYJdHrRma3dY0+mEIVZ1sXNObTCGD8auW4H8hAtq2H/SaFNtqfQKwzR+7ePxLGDErW05U2uTbovv+9TbXUq8iBgKVg785rgpgl0etGZrd1jT6YQhVnWxc05tMIYPxq5bgfxDZDGpPAAAAgAAAAIAAAACAIgYC2rYf9JoU22p9ArDNH7t4/EsYMStbTlTa5Nui+/71NtcQ2QxqTwAAAIAAAACAAQAAgAABASAAwusLAAAAABepFLf1+vQOPUClpFmx2zU18rcvqSHohyICAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBAQMEAQAAAAEEIgAgjCNTFzdDtZXftKB7crqOQuN5fadOh/59nXSX47ICiQMBBUdSIQMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3CECOt2QTz1tz1nduQaw3uI1Kbf/ue1Q5ehhUZJoYCIfDnNSriIGAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zENkMak8AAACAAAAAgAMAAIAiBgMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3BDZDGpPAAAAgAAAAIACAACAACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
def test_psbt_fails_signer_checks_003(self):
# Case: redeemScript with witness UTXO does not match the scriptPubKey
with self.assertRaises(PSBTInputConsistencyFailure):
            tx1 = PartialTransaction.from_raw_psbt(bytes.fromhex(
                '70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f618765000000220202dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d7483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01010304010000000104475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae2206029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f10d90c6a4f000000800000008000000080220602dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d710d90c6a4f0000008000000080010000800001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8872202023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e73473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d2010103040100000001042200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b2028900010547522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ae2206023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7310d90c6a4f000000800000008003000080220603089dc10c7ac6db54f91329'
                'af617333db388cead0c231f723379d1b99030b02dc10d90c6a4f00000080000000800200008000220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(PSBTInputConsistencyFailure):
tx2 = PartialTransaction.from_raw_psbt('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAAiAgLath/0mhTban0CsM0fu3j8SxgxK1tOVNrk26L7/vU210gwRQIhAPYQOLMI3B2oZaNIUnRvAVdyk0IIxtJEVDk82ZvfIhd3AiAFbmdaZ1ptCgK4WxTl4pB02KJam1dgvqKBb2YZEKAG6gEBAwQBAAAAAQRHUiEClYO/Oa4KYJdHrRma3dY0+mEIVZ1sXNObTCGD8auW4H8hAtq2H/SaFNtqfQKwzR+7ePxLGDErW05U2uTbovv+9TbXUq4iBgKVg785rgpgl0etGZrd1jT6YQhVnWxc05tMIYPxq5bgfxDZDGpPAAAAgAAAAIAAAACAIgYC2rYf9JoU22p9ArDNH7t4/EsYMStbTlTa5Nui+/71NtcQ2QxqTwAAAIAAAACAAQAAgAABASAAwusLAAAAABepFLf1+vQOPUClpFmx2zU18rcvqSHohyICAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBAQMEAQAAAAEEIgAgjCNTFzdDtZXftKB7crqOQuN5fadOh/59nXSX47ICiQABBUdSIQMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3CECOt2QTz1tz1nduQaw3uI1Kbf/ue1Q5ehhUZJoYCIfDnNSriIGAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zENkMak8AAACAAAAAgAMAAIAiBgMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3BDZDGpPAAAAgAAAAIACAACAACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
def test_psbt_fails_signer_checks_004(self):
# Case: witnessScript with witness UTXO does not match the redeemScript
with self.assertRaises(PSBTInputConsistencyFailure):
            tx1 = PartialTransaction.from_raw_psbt(bytes.fromhex(
                '70736274ff01009a020000000258e87a21b56daf0c23be8e7070456c336f7cbaa5c8757924f545887bb2abdd750000000000ffffffff838d0427d0ec650a68aa46bb0b098aea4422c071b2ca78352a077959d07cea1d0100000000ffffffff0270aaf00800000000160014d85c2b71d0060b09c9886aeb815e50991dda124d00e1f5050000000016001400aea9a2e5f0f876a588df5546e8742d1d87008f00000000000100bb0200000001aad73931018bd25f84ae400b68848be09db706eac2ac18298babee71ab656f8b0000000048473044022058f6fc7c6a33e1b31548d481c826c015bd30135aad42cd67790dab66d2ad243b02204a1ced2604c6735b6393e5b41691dd78b00f0c5942fb9f751856faa938157dba01feffffff0280f0fa020000000017a9140fb9463421696b82c833af241c78c17ddbde493487d0f20a270100000017a91429ca74f8a08f81999428185c97b5d852e4063f618765000000220202dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d7483045022100f61038b308dc1da865a34852746f015772934208c6d24454393cd99bdf2217770220056e675a675a6d0a02b85b14e5e29074d8a25a9b5760bea2816f661910a006ea01010304010000000104475221029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f2102dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d752ae2206029583bf39ae0a609747ad199addd634fa6108559d6c5cd39b4c2183f1ab96e07f10d90c6a4f000000800000008000000080220602dab61ff49a14db6a7d02b0cd1fbb78fc4b18312b5b4e54dae4dba2fbfef536d710d90c6a4f0000008000000080010000800001012000c2eb0b0000000017a914b7f5faf40e3d40a5a459b1db3535f2b72fa921e8872202023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e73473044022065f45ba5998b59a27ffe1a7bed016af1f1f90d54b3aa8f7450aa5f56a25103bd02207f724703ad1edb96680b284b56d4ffcb88f7fb759eabbe08aa30f29b851383d2010103040100000001042200208c2353173743b595dfb4a07b72ba8e42e3797da74e87fe7d9d7497e3b2028903010547522103089dc10c7ac6db54f91329af617333db388cead0c231f723379d1b99030b02dc21023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7352ad2206023add904f3d6dcf59ddb906b0dee23529b7ffb9ed50e5e86151926860221f0e7310d90c6a4f000000800000008003000080220603089dc10c7ac6db54f91329'
                'af617333db388cead0c231f723379d1b99030b02dc10d90c6a4f00000080000000800200008000220203a9a4c37f5996d3aa25dbac6b570af0650394492942460b354753ed9eeca5877110d90c6a4f000000800000008004000080002202027f6399757d2eff55a136ad02c684b1838b6556e5f1b6b34282a94b6b5005109610d90c6a4f00000080000000800500008000'))
with self.assertRaises(PSBTInputConsistencyFailure):
tx2 = PartialTransaction.from_raw_psbt('cHNidP8BAJoCAAAAAljoeiG1ba8MI76OcHBFbDNvfLqlyHV5JPVFiHuyq911AAAAAAD/////g40EJ9DsZQpoqka7CwmK6kQiwHGyyng1Kgd5WdB86h0BAAAAAP////8CcKrwCAAAAAAWABTYXCtx0AYLCcmIauuBXlCZHdoSTQDh9QUAAAAAFgAUAK6pouXw+HaliN9VRuh0LR2HAI8AAAAAAAEAuwIAAAABqtc5MQGL0l+ErkALaISL4J23BurCrBgpi6vucatlb4sAAAAASEcwRAIgWPb8fGoz4bMVSNSByCbAFb0wE1qtQs1neQ2rZtKtJDsCIEoc7SYExnNbY5PltBaR3XiwDwxZQvufdRhW+qk4FX26Af7///8CgPD6AgAAAAAXqRQPuUY0IWlrgsgzryQceMF9295JNIfQ8gonAQAAABepFCnKdPigj4GZlCgYXJe12FLkBj9hh2UAAAAiAgLath/0mhTban0CsM0fu3j8SxgxK1tOVNrk26L7/vU210gwRQIhAPYQOLMI3B2oZaNIUnRvAVdyk0IIxtJEVDk82ZvfIhd3AiAFbmdaZ1ptCgK4WxTl4pB02KJam1dgvqKBb2YZEKAG6gEBAwQBAAAAAQRHUiEClYO/Oa4KYJdHrRma3dY0+mEIVZ1sXNObTCGD8auW4H8hAtq2H/SaFNtqfQKwzR+7ePxLGDErW05U2uTbovv+9TbXUq4iBgKVg785rgpgl0etGZrd1jT6YQhVnWxc05tMIYPxq5bgfxDZDGpPAAAAgAAAAIAAAACAIgYC2rYf9JoU22p9ArDNH7t4/EsYMStbTlTa5Nui+/71NtcQ2QxqTwAAAIAAAACAAQAAgAABASAAwusLAAAAABepFLf1+vQOPUClpFmx2zU18rcvqSHohyICAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zRzBEAiBl9FulmYtZon/+GnvtAWrx8fkNVLOqj3RQql9WolEDvQIgf3JHA60e25ZoCyhLVtT/y4j3+3Weq74IqjDym4UTg9IBAQMEAQAAAAEEIgAgjCNTFzdDtZXftKB7crqOQuN5fadOh/59nXSX47ICiQMBBUdSIQMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3CECOt2QTz1tz1nduQaw3uI1Kbf/ue1Q5ehhUZJoYCIfDnNSrSIGAjrdkE89bc9Z3bkGsN7iNSm3/7ntUOXoYVGSaGAiHw5zENkMak8AAACAAAAAgAMAAIAiBgMIncEMesbbVPkTKa9hczPbOIzq0MIx9yM3nRuZAwsC3BDZDGpPAAAAgAAAAIACAACAACICA6mkw39ZltOqJdusa1cK8GUDlEkpQkYLNUdT7Z7spYdxENkMak8AAACAAAAAgAQAAIAAIgICf2OZdX0u/1WhNq0CxoSxg4tlVuXxtrNCgqlLa1AFEJYQ2QxqTwAAAIAAAACABQAAgAA=')
class TestPSBTComplexChecks(TestCaseForTestnet):
# test cases from BIP-0174
def test_psbt_combiner_unknown_fields(self):
tx1 = tx_from_any("70736274ff01003f0200000001ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000000000ffffffff010000000000000000036a0100000000000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f00")
tx2 = tx_from_any("70736274ff01003f0200000001ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000000000ffffffff010000000000000000036a0100000000000a0f0102030405060708100f0102030405060708090a0b0c0d0e0f000a0f0102030405060708100f0102030405060708090a0b0c0d0e0f000a0f0102030405060708100f0102030405060708090a0b0c0d0e0f00")
tx1.combine_with_other_psbt(tx2)
self.assertEqual("70736274ff01003f0200000001ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000000000ffffffff010000000000000000036a0100000000000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f0a0f0102030405060708100f0102030405060708090a0b0c0d0e0f000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f0a0f0102030405060708100f0102030405060708090a0b0c0d0e0f000a0f0102030405060708090f0102030405060708090a0b0c0d0e0f0a0f0102030405060708100f0102030405060708090a0b0c0d0e0f00",
tx1.serialize_as_bytes().hex())
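The combiner test above merges two PSBTs whose unknown key-value fields have different keys, and the expected serialization carries entries from both. A hedged standalone sketch of that merge rule (`combine_unknown_fields` is an invented name, not electrum's API; BIP-174 leaves unspecified which value wins when the same key appears in both, and here the first PSBT wins):

```python
def combine_unknown_fields(a: dict, b: dict) -> dict:
    """Merge the unknown key-value maps of two PSBTs, BIP-174 combiner style."""
    merged = dict(a)                       # start from the first PSBT's fields
    for key, value in b.items():
        merged.setdefault(key, value)      # keep both entries when keys differ
    return merged

# Entries with distinct keys from both sides survive the merge.
merged = combine_unknown_fields({b'\x0f\x01': b'aa'}, {b'\x0f\x02': b'bb'})
assert merged == {b'\x0f\x01': b'aa', b'\x0f\x02': b'bb'}
```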
| 259.140741 | 2,305 | 0.930268 | 1,942 | 69,968 | 33.382595 | 0.217302 | 0.004905 | 0.007358 | 0.018757 | 0.324747 | 0.316371 | 0.307301 | 0.299696 | 0.282852 | 0.269679 | 0 | 0.413013 | 0.041205 | 69,968 | 269 | 2,306 | 260.104089 | 0.553358 | 0.030614 | 0 | 0.371134 | 0 | 0.123711 | 0.862209 | 0.861117 | 0 | 1 | 0 | 0 | 0.371134 | 1 | 0.159794 | false | 0 | 0.025773 | 0 | 0.206186 | 0.005155 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
# reviewboard/reviews/tests/test_counters.py (pombredanne/reviewboard, MIT license)
from django.contrib.auth.models import User
from kgb import SpyAgency
from reviewboard.accounts.models import Profile, LocalSiteProfile
from reviewboard.reviews.errors import NotModifiedError
from reviewboard.reviews.models import (Group, ReviewRequest,
ReviewRequestDraft)
from reviewboard.scmtools.models import Repository, Tool
from reviewboard.site.models import LocalSite
from reviewboard.testing import TestCase
class ReviewRequestCounterTests(SpyAgency, TestCase):
"""Unit tests for review request counters."""
fixtures = ['test_scmtools']
def setUp(self):
super(ReviewRequestCounterTests, self).setUp()
tool = Tool.objects.get(name='Subversion')
repository = Repository.objects.create(name='Test1', path='path1',
tool=tool)
self.user = User.objects.create_user(username='testuser', password='',
email='user@example.com')
self.profile = self.user.get_profile()
self.test_site = LocalSite.objects.create(name='test')
self.site_profile2 = \
LocalSiteProfile.objects.create(user=self.user,
profile=self.profile,
local_site=self.test_site)
self.review_request = self.create_review_request(submitter=self.user,
repository=repository)
self.profile.star_review_request(self.review_request)
self.site_profile = self.profile.site_profiles.get(local_site=None)
self.assertEqual(self.site_profile.total_outgoing_request_count, 1)
self.assertEqual(self.site_profile.pending_outgoing_request_count, 1)
self.assertEqual(self.site_profile.starred_public_request_count, 0)
self.group = Group.objects.create(name='test-group')
self.group.users.add(self.user)
self._reload_objects()
self.assertEqual(self.site_profile2.total_outgoing_request_count, 0)
self.assertEqual(self.site_profile2.pending_outgoing_request_count, 0)
self.assertEqual(self.site_profile2.starred_public_request_count, 0)
def test_new_site_profile(self):
"""Testing counters on a new LocalSiteProfile"""
self.site_profile.delete()
self.site_profile = \
LocalSiteProfile.objects.create(user=self.user,
profile=self.profile)
self.assertEqual(self.site_profile.total_outgoing_request_count, 1)
self.assertEqual(self.site_profile.pending_outgoing_request_count, 1)
self.assertEqual(self.site_profile.starred_public_request_count, 0)
self.review_request.publish(self.user)
self._reload_objects()
self.assertEqual(self.site_profile.total_outgoing_request_count, 1)
self.assertEqual(self.site_profile.pending_outgoing_request_count, 1)
self.assertEqual(self.site_profile.starred_public_request_count, 1)
def test_outgoing_requests(self):
"""Testing counters with creating outgoing review requests"""
# The review request was already created
self._check_counters(total_outgoing=1,
pending_outgoing=1)
draft = ReviewRequestDraft.create(self.review_request)
draft.target_people = [self.user]
draft.save()
self.review_request.publish(self.user)
self._check_counters(direct_incoming=1,
total_incoming=1,
total_outgoing=1,
pending_outgoing=1,
starred_public=1)
def test_closing_requests(self, close_type=ReviewRequest.DISCARDED):
"""Testing counters with closing outgoing review requests"""
# The review request was already created
self._check_counters(total_outgoing=1, pending_outgoing=1)
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.add(self.group)
draft.target_people.add(self.user)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
self.assertTrue(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self.review_request.close(close_type)
self._check_counters(total_outgoing=1)
def test_closing_draft_requests(self, close_type=ReviewRequest.DISCARDED):
"""Testing counters with closing draft review requests"""
# The review request was already created
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self.review_request.close(close_type)
self._check_counters(total_outgoing=1)
def test_closing_closed_requests(self):
"""Testing counters with closing closed review requests"""
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
self.assertTrue(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self.review_request.close(ReviewRequest.DISCARDED)
self._check_counters(total_outgoing=1)
self.review_request.close(ReviewRequest.SUBMITTED)
self._check_counters(total_outgoing=1)
def test_closing_draft_requests_with_site(self):
"""Testing counters with closing draft review requests on LocalSite"""
self.review_request.delete()
self._check_counters(with_local_site=True)
tool = Tool.objects.get(name='Subversion')
repository = Repository.objects.create(name='Test1', path='path1',
tool=tool,
local_site=self.test_site)
self.review_request = ReviewRequest.objects.create(
self.user,
repository,
local_site=self.test_site)
self._check_counters(with_local_site=True,
total_outgoing=1,
pending_outgoing=1)
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self.review_request.close(ReviewRequest.DISCARDED)
self._check_counters(with_local_site=True,
total_outgoing=1)
def test_deleting_requests(self):
"""Testing counters with deleting outgoing review requests"""
# The review request was already created
self._check_counters(total_outgoing=1,
pending_outgoing=1)
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.add(self.group)
draft.target_people.add(self.user)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
self.review_request.delete()
self._check_counters()
def test_deleting_draft_requests(self):
"""Testing counters with deleting draft review requests"""
# We're simulating what a DefaultReviewer would do by populating
# the ReviewRequest's target users and groups while not public and
# without a draft.
self.review_request.target_people.add(self.user)
self.review_request.target_groups.add(self.group)
# The review request was already created
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.delete()
self._check_counters()
def test_deleting_closed_requests(self):
"""Testing counters with deleting closed review requests"""
# We're simulating what a DefaultReviewer would do by populating
# the ReviewRequest's target users and groups while not public and
# without a draft.
self.review_request.target_people.add(self.user)
self.review_request.target_groups.add(self.group)
# The review request was already created
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.close(ReviewRequest.DISCARDED)
self._check_counters(total_outgoing=1)
self.review_request.delete()
self._check_counters()
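The bookkeeping exercised by the closing and deleting tests above can be summarized in a small standalone sketch (the real logic lives in Review Board's counter fields; this `Counters` class is invented for illustration): creating a review request bumps both the total and pending outgoing counts, while closing it removes it only from the pending count, and deleting it removes it from both.

```python
class Counters:
    """Toy model of per-profile review request counters."""

    def __init__(self):
        self.total_outgoing = 0
        self.pending_outgoing = 0

    def create(self):
        self.total_outgoing += 1
        self.pending_outgoing += 1

    def close(self):
        self.pending_outgoing -= 1     # closed requests stay in the total

    def delete(self):
        self.total_outgoing -= 1       # deletion removes the request entirely

c = Counters()
c.create()
c.close()
assert (c.total_outgoing, c.pending_outgoing) == (1, 0)
c.delete()
assert (c.total_outgoing, c.pending_outgoing) == (0, 0)
```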
def test_reopen_discarded_requests(self):
"""Testing counters with reopening discarded outgoing review requests
"""
self.test_closing_requests(ReviewRequest.DISCARDED)
self.review_request.reopen()
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
def test_reopen_submitted_requests(self):
"""Testing counters with reopening submitted outgoing review requests
"""
self.test_closing_requests(ReviewRequest.SUBMITTED)
self.review_request.reopen()
self.assertTrue(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
def test_reopen_discarded_draft_requests(self):
"""Testing counters with reopening discarded draft review requests"""
self.assertFalse(self.review_request.public)
self.test_closing_draft_requests(ReviewRequest.DISCARDED)
self.review_request.reopen()
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self._check_counters(total_outgoing=1,
pending_outgoing=1)
def test_reopen_submitted_draft_requests(self):
"""Testing counters with reopening submitted draft review requests"""
self.test_closing_requests(ReviewRequest.SUBMITTED)
# We're simulating what a DefaultReviewer would do by populating
# the ReviewRequest's target users and groups while not public and
# without a draft.
self.review_request.target_people.add(self.user)
self.review_request.target_groups.add(self.group)
self._check_counters(total_outgoing=1)
self.review_request.reopen()
self.assertTrue(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
def test_double_publish(self):
"""Testing counters with publishing a review request twice"""
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status,
ReviewRequest.PENDING_REVIEW)
# Publish the first time.
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
# Publish the second time.
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
def test_add_group(self):
"""Testing counters when adding a group reviewer"""
draft = ReviewRequestDraft.create(self.review_request)
draft.summary = 'Test Summary'
draft.target_groups.add(self.group)
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
group_incoming=1,
starred_public=1)
def test_remove_group(self):
"""Testing counters when removing a group reviewer"""
self.test_add_group()
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.remove(self.group)
# There must be at least one target_group or target_people
draft.target_people = [self.user]
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
direct_incoming=0,
group_incoming=1,
starred_public=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1)
def test_remove_group_and_fail_publish(self):
"""Testing counters when removing a group reviewer and then
failing to publish the draft
"""
self.test_add_group()
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.remove(self.group)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
group_incoming=1,
starred_public=1)
self.spy_on(ReviewRequestDraft.publish,
owner=ReviewRequestDraft,
call_fake=self._raise_publish_error)
with self.assertRaises(NotModifiedError):
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
group_incoming=1,
starred_public=1)
def test_add_person(self):
"""Testing counters when adding a person reviewer"""
draft = ReviewRequestDraft.create(self.review_request)
draft.summary = 'Test Summary'
draft.target_people.add(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1)
def test_remove_person(self):
"""Testing counters when removing a person reviewer"""
self.test_add_person()
draft = ReviewRequestDraft.create(self.review_request)
draft.target_people.remove(self.user)
# There must be at least one target_group or target_people
draft.target_groups = [self.group]
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
group_incoming=1,
total_incoming=1,
starred_public=1)
def test_remove_person_and_fail_publish(self):
"""Testing counters when removing a person reviewer and then
failing to publish the draft
"""
self.test_add_person()
draft = ReviewRequestDraft.create(self.review_request)
draft.target_people.remove(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1)
self.spy_on(ReviewRequestDraft.publish,
owner=ReviewRequestDraft,
call_fake=self._raise_publish_error)
with self.assertRaises(NotModifiedError):
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1)
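The two `*_fail_publish` tests above follow the same pattern: swap `publish()` for a stub that raises, then assert the counters were left untouched. A minimal self-contained version of that pattern, using `unittest.mock` as a stand-in for the `kgb` `spy_on`/`call_fake` machinery the suite actually uses:

```python
from unittest import mock


class Draft:
    """Stand-in for ReviewRequestDraft; not the real Review Board class."""

    def publish(self):
        return 'published'


draft = Draft()

# Replace publish() so it raises, analogous to
# call_fake=self._raise_publish_error in the tests above.
with mock.patch.object(Draft, 'publish', side_effect=RuntimeError('boom')):
    try:
        draft.publish()
        raised = False
    except RuntimeError:
        raised = True

# The patch is reverted on leaving the context, so later publishes succeed.
# That is why the tests can re-check counters against pre-failure values.
restored = Draft().publish()
```
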
def test_populate_counters(self):
"""Testing counters when populated from a fresh upgrade or clear"""
# The review request was already created
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.add(self.group)
draft.target_people.add(self.user)
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
direct_incoming=1,
starred_public=1,
group_incoming=1)
LocalSiteProfile.objects.update(
direct_incoming_request_count=None,
total_incoming_request_count=None,
pending_outgoing_request_count=None,
total_outgoing_request_count=None,
starred_public_request_count=None)
Group.objects.update(incoming_request_count=None)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1,
direct_incoming=1,
starred_public=1,
group_incoming=1)
def test_populate_counters_after_change(self):
"""Testing counter inc/dec on uninitialized counter fields"""
# The review request was already created
draft = ReviewRequestDraft.create(self.review_request)
draft.target_groups.add(self.group)
draft.target_people.add(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1)
LocalSiteProfile.objects.update(
direct_incoming_request_count=None,
total_incoming_request_count=None,
pending_outgoing_request_count=None,
total_outgoing_request_count=None,
starred_public_request_count=None)
Group.objects.update(incoming_request_count=None)
profile_fields = [
'direct_incoming_request_count',
'total_incoming_request_count',
'pending_outgoing_request_count',
'total_outgoing_request_count',
'starred_public_request_count',
]
# Lock the fields so we don't re-initialize them on publish.
locks = {
self.site_profile: 1,
self.site_profile2: 1,
}
for field in profile_fields:
getattr(LocalSiteProfile, field)._locks = locks
Group.incoming_request_count._locks = locks
# Publish the review request. This will normally try to
# increment/decrement the counts, which it should ignore now.
self.review_request.publish(self.user)
# Unlock the profiles so we can query/re-initialize them again.
for field in profile_fields:
getattr(LocalSiteProfile, field)._locks = {}
Group.incoming_request_count._locks = {}
self._check_counters(total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1,
starred_public=1,
group_incoming=1)
def test_counts_with_reassignment_in_initial_draft(self):
"""Testing counters when changing review request ownership in initial
draft
"""
self._check_counters(total_outgoing=1,
pending_outgoing=1)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
draft = ReviewRequestDraft.create(self.review_request)
draft.owner = new_user
draft.target_people = [draft.owner]
draft.save()
self.review_request.publish(self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=1)
site_profile = LocalSiteProfile.objects.get(pk=site_profile.pk)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1)
def test_counts_with_reassignment_in_initial_draft_new_profile(self):
"""Testing counters when changing review request ownership in initial
draft and new owner without initial site profile
"""
self._check_counters(total_outgoing=1,
pending_outgoing=1)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
draft = ReviewRequestDraft.create(self.review_request)
draft.owner = new_user
draft.target_people = [draft.owner]
draft.save()
self.review_request.publish(self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=1)
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1)
def test_counts_with_reassignment_after_publish(self):
"""Testing counters when changing review request ownership after
publish
"""
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
draft = ReviewRequestDraft.create(self.review_request)
draft.owner = new_user
draft.target_people = [draft.owner]
draft.save()
self.review_request.publish(self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=1)
site_profile = LocalSiteProfile.objects.get(pk=site_profile.pk)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1)
def test_counts_with_reassignment_after_publish_new_profile(self):
"""Testing counters when changing review request ownership after
publish and new owner without initial site profile
"""
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
draft = ReviewRequestDraft.create(self.review_request)
draft.owner = new_user
draft.target_people = [draft.owner]
draft.save()
self.review_request.publish(self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=1)
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=1,
direct_incoming=1,
total_incoming=1)
def test_counts_with_reassignment_and_close(self):
"""Testing counters when changing review request ownership and closing
in same operation
"""
self.review_request.publish(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
starred_public=1)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
# Note that it's not normally possible to update something like an
# owner while also closing in the same operation. Drafts don't allow
# it. However, we have logic that considers these combinations of
# operations, and it's technically possible to do, so we're testing
# it here by updating the review request manually.
self.review_request.owner = new_user
self.review_request.status = ReviewRequest.SUBMITTED
self.review_request.save(update_counts=True,
old_submitter=self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=0)
site_profile = LocalSiteProfile.objects.get(pk=site_profile.pk)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=0)
def test_counts_with_reassignment_and_reopen(self):
"""Testing counters when changing review request ownership and
reopening in same operation
"""
self.review_request.close(ReviewRequest.DISCARDED)
self.assertFalse(self.review_request.public)
self.assertEqual(self.review_request.status, ReviewRequest.DISCARDED)
self._check_counters(total_outgoing=1,
pending_outgoing=0,
starred_public=0)
new_user = User.objects.create_user(username='test2',
password='',
email='user@example.com')
site_profile = \
new_user.get_site_profile(self.review_request.local_site)
# Note that it's not normally possible to update something like an
# owner while also reopening in the same operation. Drafts don't allow
# it. However, we have logic that considers these combinations of
# operations, and it's technically possible to do, so we're testing
# it here by updating the review request manually.
self.review_request.owner = new_user
self.review_request.status = ReviewRequest.PENDING_REVIEW
self.review_request.save(update_counts=True,
old_submitter=self.user)
self._check_counters(total_outgoing=0,
pending_outgoing=0,
starred_public=0)
site_profile = LocalSiteProfile.objects.get(pk=site_profile.pk)
self._check_counters_on_profile(site_profile,
total_outgoing=1,
pending_outgoing=1)
def test_counts_with_join_group(self):
"""Testing counters when joining a review group"""
user2 = self.create_user()
group2 = self.create_review_group(name='group2')
self.create_review_request(submitter=user2,
target_groups=[group2],
publish=True)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=0)
group2.users.add(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1)
def test_counts_with_leave_group(self):
"""Testing counters when leaving a review group"""
user2 = self.create_user()
group2 = self.create_review_group(name='group2')
group2.users.add(self.user)
self.create_review_request(submitter=user2,
target_groups=[group2],
publish=True)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=1)
group2.users.remove(self.user)
self._check_counters(total_outgoing=1,
pending_outgoing=1,
total_incoming=0)
def _check_counters(self, total_outgoing=0, pending_outgoing=0,
direct_incoming=0, total_incoming=0,
starred_public=0, group_incoming=0,
with_local_site=False):
"""Check that the counters match the expected values.
Args:
total_outgoing (int):
The expected number of total outgoing review requests.
pending_outgoing (int):
The expected number of pending outgoing review requests.
direct_incoming (int):
The expected number of review requests assigned directly to the
user.
total_incoming (int):
The expected number of review requests assigned either directly
or indirectly to the user.
starred_public (int):
The expected number of public review requests starred by the
user.
group_incoming (int):
The expected number of review requests assigned to the test
group.
with_local_site (bool):
Whether to run the test for a local site.
"""
self._reload_objects()
if with_local_site:
main_site_profile = self.site_profile2
unused_site_profile = self.site_profile
else:
main_site_profile = self.site_profile
unused_site_profile = self.site_profile2
self._check_counters_on_profile(main_site_profile, total_outgoing,
pending_outgoing, direct_incoming,
total_incoming, starred_public)
self.assertEqual(
self.group.incoming_request_count,
group_incoming,
'Expected Group.incoming_request_count to be %s. Got %s instead.'
% (group_incoming, self.group.incoming_request_count))
# These should never be affected by updates on the main LocalSite we're
# working with, so they should always be 0.
self._check_counters_on_profile(unused_site_profile)
def _check_counters_on_profile(self, profile, total_outgoing=0,
pending_outgoing=0, direct_incoming=0,
total_incoming=0, starred_public=0):
"""Check that the counters match the expected values.
Args:
profile (reviewboard.accounts.models.LocalSiteProfile):
The profile object to test counts on.
total_outgoing (int):
The expected number of total outgoing review requests.
pending_outgoing (int):
The expected number of pending outgoing review requests.
direct_incoming (int):
The expected number of review requests assigned directly to the
user.
total_incoming (int):
The expected number of review requests assigned either directly
or indirectly to the user.
starred_public (int):
The expected number of public review requests starred by the
user.
"""
msg = 'Expected %s to be %s. Got %s instead.'
self.assertEqual(
profile.total_outgoing_request_count,
total_outgoing,
msg % ('total_outgoing_request_count', total_outgoing,
profile.total_outgoing_request_count))
self.assertEqual(
profile.pending_outgoing_request_count,
pending_outgoing,
msg % ('pending_outgoing_request_count', pending_outgoing,
profile.pending_outgoing_request_count))
self.assertEqual(
profile.direct_incoming_request_count,
direct_incoming,
msg % ('direct_incoming_request_count', direct_incoming,
profile.direct_incoming_request_count))
self.assertEqual(
profile.total_incoming_request_count,
total_incoming,
msg % ('total_incoming_request_count', total_incoming,
profile.total_incoming_request_count))
self.assertEqual(
profile.starred_public_request_count,
starred_public,
msg % ('starred_public_request_count', starred_public,
profile.starred_public_request_count))
def _reload_objects(self):
self.test_site = LocalSite.objects.get(pk=self.test_site.pk)
self.site_profile = \
LocalSiteProfile.objects.get(pk=self.site_profile.pk)
self.site_profile2 = \
LocalSiteProfile.objects.get(pk=self.site_profile2.pk)
self.group = Group.objects.get(pk=self.group.pk)
def _raise_publish_error(self, *args, **kwargs):
raise NotModifiedError()
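The `_locks` juggling in `test_populate_counters_after_change` depends on counter fields that silently skip updates while locked. A toy standalone model of that behavior (illustrative only; Review Board's real `CounterField` descriptor is more involved than this sketch):

```python
class LockableCounter:
    """A counter that ignores increment/decrement while locked.

    Hypothetical sketch: the real CounterField tracks locks per model
    instance in a ``_locks`` dict, loosely mirrored here.
    """

    def __init__(self, value=0):
        self.value = value
        self._locks = {}  # instance -> lock refcount

    def lock(self, instance):
        self._locks[instance] = self._locks.get(instance, 0) + 1

    def unlock(self, instance):
        if self._locks.get(instance):
            self._locks[instance] -= 1

    def increment(self, instance, by=1):
        # Locked instances skip the update, like counters during a
        # publish that must not re-initialize uninitialized fields.
        if not self._locks.get(instance):
            self.value += by


counter = LockableCounter()
counter.increment('profile')   # applied -> 1
counter.lock('profile')
counter.increment('profile')   # ignored while locked
counter.unlock('profile')
counter.increment('profile')   # applied -> 2
```
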
d9d0a1dae8e9bc3db1c2ad76d598d44fdbdc1b45 | 10,318 | py | Python | st2common/tests/unit/test_rbac_types.py | kkkanil/st2 | 07cd195d7a6e177a37dd019e5c9ab8329259d0fa | ["Apache-2.0"]
# Copyright 2020 The StackStorm Authors.
# Copyright 2019 Extreme Networks, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from unittest2 import TestCase
from st2common.constants.types import ResourceType as SystemType
from st2common.rbac.types import PermissionType
from st2common.rbac.types import ResourceType
class RBACPermissionTypeTestCase(TestCase):
def test_get_valid_permission_for_resource_type(self):
valid_action_permissions = PermissionType.get_valid_permissions_for_resource_type(
resource_type=ResourceType.ACTION)
for name in valid_action_permissions:
self.assertTrue(name.startswith(ResourceType.ACTION + '_'))
valid_rule_permissions = PermissionType.get_valid_permissions_for_resource_type(
resource_type=ResourceType.RULE)
for name in valid_rule_permissions:
self.assertTrue(name.startswith(ResourceType.RULE + '_'))
def test_get_resource_type(self):
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_LIST),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_VIEW),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_CREATE),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_MODIFY),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_DELETE),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.PACK_ALL),
SystemType.PACK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.SENSOR_LIST),
SystemType.SENSOR_TYPE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.SENSOR_VIEW),
SystemType.SENSOR_TYPE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.SENSOR_MODIFY),
SystemType.SENSOR_TYPE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.SENSOR_ALL),
SystemType.SENSOR_TYPE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_LIST),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_VIEW),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_CREATE),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_MODIFY),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_DELETE),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_EXECUTE),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.ACTION_ALL),
SystemType.ACTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.EXECUTION_LIST),
SystemType.EXECUTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.EXECUTION_VIEW),
SystemType.EXECUTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.EXECUTION_RE_RUN),
SystemType.EXECUTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.EXECUTION_STOP),
SystemType.EXECUTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.EXECUTION_ALL),
SystemType.EXECUTION)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_LIST),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_VIEW),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_CREATE),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_MODIFY),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_DELETE),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_ALL),
SystemType.RULE)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_ENFORCEMENT_LIST),
SystemType.RULE_ENFORCEMENT)
self.assertEqual(PermissionType.get_resource_type(PermissionType.RULE_ENFORCEMENT_VIEW),
SystemType.RULE_ENFORCEMENT)
self.assertEqual(PermissionType.get_resource_type(PermissionType.KEY_VALUE_VIEW),
SystemType.KEY_VALUE_PAIR)
self.assertEqual(PermissionType.get_resource_type(PermissionType.KEY_VALUE_SET),
SystemType.KEY_VALUE_PAIR)
self.assertEqual(PermissionType.get_resource_type(PermissionType.KEY_VALUE_DELETE),
SystemType.KEY_VALUE_PAIR)
self.assertEqual(PermissionType.get_resource_type(PermissionType.WEBHOOK_CREATE),
SystemType.WEBHOOK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.WEBHOOK_SEND),
SystemType.WEBHOOK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.WEBHOOK_DELETE),
SystemType.WEBHOOK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.WEBHOOK_ALL),
SystemType.WEBHOOK)
self.assertEqual(PermissionType.get_resource_type(PermissionType.API_KEY_LIST),
SystemType.API_KEY)
self.assertEqual(PermissionType.get_resource_type(PermissionType.API_KEY_VIEW),
SystemType.API_KEY)
self.assertEqual(PermissionType.get_resource_type(PermissionType.API_KEY_CREATE),
SystemType.API_KEY)
self.assertEqual(PermissionType.get_resource_type(PermissionType.API_KEY_DELETE),
SystemType.API_KEY)
self.assertEqual(PermissionType.get_resource_type(PermissionType.API_KEY_ALL),
SystemType.API_KEY)
def test_get_permission_type(self):
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.ACTION,
permission_name='view'),
PermissionType.ACTION_VIEW)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.ACTION,
permission_name='all'),
PermissionType.ACTION_ALL)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.ACTION,
permission_name='execute'),
PermissionType.ACTION_EXECUTE)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.RULE,
permission_name='view'),
PermissionType.RULE_VIEW)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.RULE,
permission_name='delete'),
PermissionType.RULE_DELETE)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.SENSOR,
permission_name='view'),
PermissionType.SENSOR_VIEW)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.SENSOR,
permission_name='all'),
PermissionType.SENSOR_ALL)
self.assertEqual(PermissionType.get_permission_type(resource_type=ResourceType.SENSOR,
permission_name='modify'),
PermissionType.SENSOR_MODIFY)
self.assertEqual(
PermissionType.get_permission_type(resource_type=ResourceType.RULE_ENFORCEMENT,
permission_name='view'),
PermissionType.RULE_ENFORCEMENT_VIEW)
def test_get_permission_name(self):
self.assertEqual(PermissionType.get_permission_name(PermissionType.ACTION_LIST),
'list')
self.assertEqual(PermissionType.get_permission_name(PermissionType.ACTION_CREATE),
'create')
self.assertEqual(PermissionType.get_permission_name(PermissionType.ACTION_DELETE),
'delete')
self.assertEqual(PermissionType.get_permission_name(PermissionType.ACTION_ALL),
'all')
self.assertEqual(PermissionType.get_permission_name(PermissionType.PACK_ALL),
'all')
self.assertEqual(PermissionType.get_permission_name(PermissionType.SENSOR_MODIFY),
'modify')
self.assertEqual(PermissionType.get_permission_name(PermissionType.ACTION_EXECUTE),
'execute')
self.assertEqual(PermissionType.get_permission_name(PermissionType.RULE_ENFORCEMENT_LIST),
'list')
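The lookups exercised above all hinge on one convention: a permission identifier is `<resource_type>_<permission_name>` (e.g. `action_execute`). A hedged sketch of deriving both halves by prefix matching (hypothetical helpers, not st2's actual implementation — the real `PermissionType` also maps some resources, such as sensors, onto distinct system types):

```python
# Resource prefixes assumed for illustration; st2 defines many more.
RESOURCE_TYPES = ['action', 'rule_enforcement', 'rule', 'sensor', 'pack']


def get_resource_type(permission):
    # Try the longest prefix first so 'rule_enforcement_view' maps to
    # 'rule_enforcement' rather than 'rule'.
    for rtype in sorted(RESOURCE_TYPES, key=len, reverse=True):
        if permission.startswith(rtype + '_'):
            return rtype
    raise ValueError('Unknown permission: %s' % permission)


def get_permission_name(permission):
    # Whatever follows the resource prefix is the permission name.
    rtype = get_resource_type(permission)
    return permission[len(rtype) + 1:]
```
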
d9de701baf3b4cac7725df9006b5d356b11e76f6 | 71,489 | py | Python | tests/unit/modules/test_junos.py | nevins-b/salt | 56363bc41ca36e757103df3504d1bb07e3a7251b | ["Apache-2.0"]
# -*- coding: utf-8 -*-
'''
:codeauthor: :email:`Rajvi Dhimar <rajvidhimar95@gmail.com>`
'''
# Import python libs
from __future__ import absolute_import, print_function
# Import test libs
from tests.support.mixins import LoaderModuleMockMixin, XMLEqualityMixin
from tests.support.mock import patch, mock_open
from tests.support.unit import skipIf, TestCase
# Import 3rd-party libs
try:
from lxml import etree
except ImportError:
from salt._compat import ElementTree as etree
try:
from jnpr.junos.utils.config import Config
from jnpr.junos.utils.sw import SW
from jnpr.junos.device import Device
HAS_JUNOS = True
except ImportError:
HAS_JUNOS = False
# Import salt modules
import salt.modules.junos as junos
@skipIf(not HAS_JUNOS, 'Missing dependencies')
class Test_Junos_Module(TestCase, LoaderModuleMockMixin, XMLEqualityMixin):
def setup_loader_modules(self):
return {
junos: {
'__proxy__': {
'junos.conn': self.make_connect,
'junos.get_serialized_facts': self.get_facts
},
'__salt__': {'cp.get_template': self.mock_cp}
}
}
def mock_cp(self, *args, **kwargs):
pass
@patch('ncclient.manager.connect')
def make_connect(self, mock_connect):
self.dev = Device(
host='1.1.1.1',
user='test',
password='test123',
gather_facts=False)
self.dev.open()
self.dev.timeout = 30
self.dev.bind(cu=Config)
self.dev.bind(sw=SW)
self.addCleanup(delattr, self, 'dev')
return self.dev
def raise_exception(self, *args, **kwargs):
raise Exception('Test exception')
def get_facts(self):
facts = {'2RE': True,
'HOME': '/var/home/regress',
'RE0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 54 seconds'},
'RE1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 41 seconds'},
'RE_hw_mi': False,
'current_re': ['re0', 'master', 'node', 'fwdd', 'member', 'pfem'],
'domain': 'englab.juniper.net',
'fqdn': 'R1_re0.englab.juniper.net',
'hostname': 'R1_re0',
'hostname_info': {'re0': 'R1_re0', 're1': 'R1_re01'},
'ifd_style': 'CLASSIC',
'junos_info': {'re0': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'},
're1': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'}},
'master': 'RE0',
'model': 'MX240',
'model_info': {'re0': 'MX240', 're1': 'MX240'},
'personality': 'MX',
're_info': {'default': {'0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'},
'1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK'},
'default': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'}}},
're_master': {'default': '0'},
'serialnumber': 'VMX4eaf',
'srx_cluster': None,
'switch_style': 'BRIDGE_DOMAIN',
'vc_capable': False,
'vc_fabric': None,
'vc_master': None,
'vc_mode': None,
'version': '16.1I20160413_0837_aamish',
'version_RE0': '16.1I20160413_0837_aamish',
'version_RE1': '16.1I20160413_0837_aamish',
'version_info': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'virtual': True}
return facts
@patch('salt.modules.saltutil.sync_grains')
def test_facts_refresh(self, mock_sync_grains):
ret = dict()
ret['facts'] = {'2RE': True,
'HOME': '/var/home/regress',
'RE0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 54 seconds'},
'RE1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 41 seconds'},
'RE_hw_mi': False,
'current_re': ['re0', 'master', 'node', 'fwdd', 'member', 'pfem'],
'domain': 'englab.juniper.net',
'fqdn': 'R1_re0.englab.juniper.net',
'hostname': 'R1_re0',
'hostname_info': {'re0': 'R1_re0', 're1': 'R1_re01'},
'ifd_style': 'CLASSIC',
'junos_info': {'re0': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'},
're1': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'}},
'master': 'RE0',
'model': 'MX240',
'model_info': {'re0': 'MX240', 're1': 'MX240'},
'personality': 'MX',
're_info': {'default': {'0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'},
'1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK'},
'default': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'}}},
're_master': {'default': '0'},
'serialnumber': 'VMX4eaf',
'srx_cluster': None,
'switch_style': 'BRIDGE_DOMAIN',
'vc_capable': False,
'vc_fabric': None,
'vc_master': None,
'vc_mode': None,
'version': '16.1I20160413_0837_aamish',
'version_RE0': '16.1I20160413_0837_aamish',
'version_RE1': '16.1I20160413_0837_aamish',
'version_info': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'virtual': True}
ret['out'] = True
self.assertEqual(junos.facts_refresh(), ret)
@patch('jnpr.junos.device.Device.facts_refresh')
def test_facts_refresh_exception(self, mock_facts_refresh):
mock_facts_refresh.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Execution failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.facts_refresh(), ret)
def test_facts(self):
ret = dict()
ret['facts'] = {'2RE': True,
'HOME': '/var/home/regress',
'RE0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 54 seconds'},
'RE1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK',
'up_time': '11 days, 23 hours, 16 minutes, 41 seconds'},
'RE_hw_mi': False,
'current_re': ['re0', 'master', 'node', 'fwdd', 'member', 'pfem'],
'domain': 'englab.juniper.net',
'fqdn': 'R1_re0.englab.juniper.net',
'hostname': 'R1_re0',
'hostname_info': {'re0': 'R1_re0', 're1': 'R1_re01'},
'ifd_style': 'CLASSIC',
'junos_info': {'re0': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'},
're1': {'object': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'text': '16.1I20160413_0837_aamish'}},
'master': 'RE0',
'model': 'MX240',
'model_info': {'re0': 'MX240', 're1': 'MX240'},
'personality': 'MX',
're_info': {'default': {'0': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'},
'1': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'backup',
'model': 'RE-VMX',
'status': 'OK'},
'default': {'last_reboot_reason': '0x200:normal shutdown',
'mastership_state': 'master',
'model': 'RE-VMX',
'status': 'OK'}}},
're_master': {'default': '0'},
'serialnumber': 'VMX4eaf',
'srx_cluster': None,
'switch_style': 'BRIDGE_DOMAIN',
'vc_capable': False,
'vc_fabric': None,
'vc_master': None,
'vc_mode': None,
'version': '16.1I20160413_0837_aamish',
'version_RE0': '16.1I20160413_0837_aamish',
'version_RE1': '16.1I20160413_0837_aamish',
'version_info': {'build': None,
'major': (16, 1),
'minor': '20160413_0837_aamish',
'type': 'I'},
'virtual': True}
ret['out'] = True
self.assertEqual(junos.facts(), ret)
def test_facts_exception(self):
with patch.dict(junos.__proxy__, {'junos.get_serialized_facts': self.raise_exception}):
ret = dict()
ret['message'] = 'Could not display facts due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.facts(), ret)
def test_set_hostname_without_args(self):
ret = dict()
ret['message'] = 'Please provide the hostname.'
ret['out'] = False
self.assertEqual(junos.set_hostname(), ret)
def test_set_hostname_load_called_with_valid_name(self):
with patch('jnpr.junos.utils.config.Config.load') as mock_load:
junos.set_hostname('test-name')
mock_load.assert_called_with(
'set system host-name test-name', format='set')
@patch('jnpr.junos.utils.config.Config.load')
def test_set_hostname_raise_exception_for_load(self, mock_load):
mock_load.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not load configuration due to error "Test exception"'
ret['out'] = False
self.assertEqual(junos.set_hostname('Test-name'), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
def test_set_hostname_raise_exception_for_commit_check(
self, mock_commit_check):
mock_commit_check.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not commit check due to error "Test exception"'
ret['out'] = False
self.assertEqual(junos.set_hostname('test-name'), ret)
@patch('jnpr.junos.utils.config.Config.load')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_set_hostname_one_arg_parsed_correctly(
self, mock_commit, mock_commit_check, mock_load):
mock_commit_check.return_value = True
args = {'comment': 'Committed via salt', '__pub_user': 'root',
'__pub_arg': ['test-name', {'comment': 'Committed via salt'}],
'__pub_fun': 'junos.set_hostname', '__pub_jid':
'20170220210915624885', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
junos.set_hostname('test-name', **args)
mock_commit.assert_called_with(comment='Committed via salt')
@patch('jnpr.junos.utils.config.Config.load')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_set_hostname_more_than_one_args_parsed_correctly(
self, mock_commit, mock_commit_check, mock_load):
mock_commit_check.return_value = True
args = {'comment': 'Committed via salt',
'__pub_user': 'root',
'__pub_arg': ['test-name',
{'comment': 'Committed via salt',
'confirm': 5}],
'__pub_fun': 'junos.set_hostname',
'__pub_jid': '20170220210915624885',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.set_hostname('test-name', **args)
mock_commit.assert_called_with(comment='Committed via salt', confirm=5)
@patch('jnpr.junos.utils.config.Config.load')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_set_hostname_successful_return_message(
self, mock_commit, mock_commit_check, mock_load):
mock_commit_check.return_value = True
args = {'comment': 'Committed via salt',
'__pub_user': 'root',
'__pub_arg': ['test-name',
{'comment': 'Committed via salt'}],
'__pub_fun': 'junos.set_hostname',
'__pub_jid': '20170220210915624885',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully changed hostname.'
ret['out'] = True
self.assertEqual(junos.set_hostname('test-name', **args), ret)
@patch('jnpr.junos.utils.config.Config.commit')
def test_set_hostname_raise_exception_for_commit(self, mock_commit):
mock_commit.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Successfully loaded host-name but commit failed with "Test exception"'
ret['out'] = False
self.assertEqual(junos.set_hostname('test-name'), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('salt.modules.junos.rollback')
def test_set_hostname_fail_commit_check(
self, mock_rollback, mock_commit_check):
mock_commit_check.return_value = False
ret = dict()
ret['out'] = False
ret['message'] = 'Successfully loaded host-name but pre-commit check failed.'
self.assertEqual(junos.set_hostname('test'), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_commit_without_args(self, mock_commit, mock_commit_check):
mock_commit.return_value = True
mock_commit_check.return_value = True
ret = dict()
ret['message'] = 'Commit Successful.'
ret['out'] = True
self.assertEqual(junos.commit(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
def test_commit_raise_commit_check_exception(self, mock_commit_check):
mock_commit_check.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not perform commit check due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.commit(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_commit_raise_commit_exception(
self, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
mock_commit.side_effect = self.raise_exception
ret = dict()
ret['out'] = False
ret['message'] = \
'Commit check succeeded but actual commit failed with "Test exception"'
self.assertEqual(junos.commit(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_commit_with_single_argument(self, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'__pub_user': 'root',
'__pub_arg': [{'sync': True}],
'sync': True,
'__pub_fun': 'junos.commit',
'__pub_jid': '20170221182531323467',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.commit(**args)
mock_commit.assert_called_with(detail=False, sync=True)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_commit_with_multiple_arguments(
self, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'comment': 'comitted via salt',
'__pub_user': 'root',
'__pub_arg': [{'comment': 'comitted via salt',
'confirm': 3,
'detail': True}],
'confirm': 3,
'detail': True,
'__pub_fun': 'junos.commit',
'__pub_jid': '20170221182856987820',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.commit(**args)
mock_commit.assert_called_with(
comment='comitted via salt', detail=True, confirm=3)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
def test_commit_pyez_commit_returning_false(
self, mock_commit, mock_commit_check):
mock_commit.return_value = False
mock_commit_check.return_value = True
ret = dict()
ret['message'] = 'Commit failed.'
ret['out'] = False
self.assertEqual(junos.commit(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
def test_commit_pyez_commit_check_returns_false(self, mock_commit_check):
mock_commit_check.return_value = False
ret = dict()
ret['out'] = False
ret['message'] = 'Pre-commit check failed.'
self.assertEqual(junos.commit(), ret)
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_exception(self, mock_rollback):
mock_rollback.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Rollback failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_without_args_success(
self, mock_rollback, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
mock_rollback.return_value = True
ret = dict()
ret['message'] = 'Rollback successful'
ret['out'] = True
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_without_args_fail(self, mock_rollback):
mock_rollback.return_value = False
ret = dict()
ret['message'] = 'Rollback failed'
ret['out'] = False
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_with_id(
self,
mock_rollback,
mock_commit,
mock_commit_check):
mock_commit_check.return_value = True
junos.rollback(id=5)
mock_rollback.assert_called_with(5)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_with_id_and_single_arg(
self, mock_rollback, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'__pub_user': 'root', '__pub_arg': [2, {'confirm': 2}],
'confirm': 2, '__pub_fun': 'junos.rollback',
'__pub_jid': '20170221184518526067', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
junos.rollback(2, **args)
mock_rollback.assert_called_with(2)
mock_commit.assert_called_with(confirm=2)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_with_id_and_multiple_args(
self, mock_rollback, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'comment': 'Comitted via salt',
'__pub_user': 'root',
'dev_timeout': 40,
'__pub_arg': [2,
{'comment': 'Comitted via salt',
'timeout': 40,
'confirm': 1}],
'confirm': 1,
'__pub_fun': 'junos.rollback',
'__pub_jid': '20170221192708251721',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.rollback(id=2, **args)
mock_rollback.assert_called_with(2)
mock_commit.assert_called_with(
comment='Comitted via salt', confirm=1, timeout=40)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_with_only_single_arg(
self, mock_rollback, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'__pub_user': 'root',
'__pub_arg': [{'sync': True}],
'sync': True,
'__pub_fun': 'junos.rollback',
'__pub_jid': '20170221193615696475',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.rollback(**args)
mock_rollback.assert_called_once_with(0)
mock_commit.assert_called_once_with(sync=True)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_with_only_multiple_args_no_id(
self, mock_rollback, mock_commit, mock_commit_check):
mock_commit_check.return_value = True
args = {'comment': 'Comitted via salt',
'__pub_user': 'root',
'__pub_arg': [{'comment': 'Comitted via salt',
'confirm': 3,
'sync': True}],
'confirm': 3,
'sync': True,
'__pub_fun': 'junos.rollback',
'__pub_jid': '20170221193945996362',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.rollback(**args)
mock_rollback.assert_called_with(0)
mock_commit.assert_called_once_with(
sync=True, confirm=3, comment='Comitted via salt')
@patch('salt.modules.junos.fopen')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_writes_diffs_file_when_diff_is_present(
self, mock_rollback, mock_commit, mock_commit_check, mock_diff, mock_fopen):
mock_commit_check.return_value = True
mock_diff.return_value = 'diff'
args = {'__pub_user': 'root',
'__pub_arg': [{'diffs_file': '/home/regress/diff',
'confirm': 2}],
'confirm': 2,
'__pub_fun': 'junos.rollback',
'__pub_jid': '20170221205153884009',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': '',
'diffs_file': '/home/regress/diff'}
junos.rollback(**args)
mock_fopen.assert_called_with('/home/regress/diff', 'w')
@patch('salt.modules.junos.fopen')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_skips_diffs_file_when_diff_is_None(
self,
mock_rollback,
mock_commit,
mock_commit_check,
mock_diff,
mock_fopen):
mock_commit_check.return_value = True
mock_diff.return_value = None
args = {'__pub_user': 'root',
'__pub_arg': [{'diffs_file': '/home/regress/diff',
'confirm': 2}],
'confirm': 2,
'__pub_fun': 'junos.rollback',
'__pub_jid': '20170221205153884009',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': '',
'diffs_file': '/home/regress/diff'}
junos.rollback(**args)
assert not mock_fopen.called
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_commit_check_exception(self,
mock_rollback,
mock_commit_check):
mock_commit_check.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not commit check due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_commit_exception(self,
mock_rollback,
mock_commit,
mock_commit_check):
mock_commit_check.return_value = True
mock_commit.side_effect = self.raise_exception
ret = dict()
ret['message'] = \
'Rollback successful but commit failed with error "Test exception"'
ret['out'] = False
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.rollback')
def test_rollback_commit_check_fails(self,
mock_rollback,
mock_commit_check):
mock_commit_check.return_value = False
ret = dict()
ret['message'] = 'Rollback succesfull but pre-commit check failed.'
ret['out'] = False
self.assertEqual(junos.rollback(), ret)
@patch('jnpr.junos.utils.config.Config.diff')
def test_diff_without_args(self, mock_diff):
junos.diff()
mock_diff.assert_called_with(rb_id=0)
@patch('jnpr.junos.utils.config.Config.diff')
def test_diff_with_arg(self, mock_diff):
junos.diff(2)
mock_diff.assert_called_with(rb_id=2)
@patch('jnpr.junos.utils.config.Config.diff')
def test_diff_exception(self, mock_diff):
mock_diff.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not get diff with error "Test exception"'
ret['out'] = False
self.assertEqual(junos.diff(), ret)
def test_ping_without_args(self):
ret = dict()
ret['message'] = 'Please specify the destination ip to ping.'
ret['out'] = False
self.assertEqual(junos.ping(), ret)
@patch('jnpr.junos.device.Device.execute')
def test_ping(self, mock_execute):
junos.ping('1.1.1.1')
args = mock_execute.call_args
rpc = '<ping><count>5</count><host>1.1.1.1</host></ping>'
self.assertEqualXML(args[0][0], rpc)
@patch('jnpr.junos.device.Device.execute')
def test_ping_ttl(self, mock_execute):
args = {'__pub_user': 'sudo_drajvi',
'__pub_arg': ['1.1.1.1',
{'ttl': 3}],
'__pub_fun': 'junos.ping',
'__pub_jid': '20170306165237683279',
'__pub_tgt': 'mac_min',
'ttl': 3,
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.ping('1.1.1.1', **args)
exec_args = mock_execute.call_args
rpc = '<ping><count>5</count><host>1.1.1.1</host><ttl>3</ttl></ping>'
self.assertEqualXML(exec_args[0][0], rpc)
@patch('jnpr.junos.device.Device.execute')
def test_ping_exception(self, mock_execute):
mock_execute.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Execution failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.ping('1.1.1.1'), ret)
def test_cli_without_args(self):
ret = dict()
ret['message'] = 'Please provide the CLI command to be executed.'
ret['out'] = False
self.assertEqual(junos.cli(), ret)
@patch('jnpr.junos.device.Device.cli')
def test_cli_with_format_as_empty_string(self, mock_cli):
junos.cli('show version', '')
mock_cli.assert_called_with('show version', 'text', warning=False)
@patch('jnpr.junos.device.Device.cli')
def test_cli(self, mock_cli):
mock_cli.return_value = 'CLI result'
ret = dict()
ret['message'] = 'CLI result'
ret['out'] = True
self.assertEqual(junos.cli('show version'), ret)
mock_cli.assert_called_with('show version', 'text', warning=False)
@patch('salt.modules.junos.jxmlease.parse')
@patch('salt.modules.junos.etree.tostring')
@patch('jnpr.junos.device.Device.cli')
def test_cli_format_xml(self, mock_cli, mock_to_string, mock_jxml):
mock_cli.return_value = '<root><a>test</a></root>'
mock_jxml.return_value = '<root><a>test</a></root>'
args = {'__pub_user': 'root',
'__pub_arg': [{'format': 'xml'}],
'format': 'xml',
'__pub_fun': 'junos.cli',
'__pub_jid': '20170221182531323467',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = '<root><a>test</a></root>'
ret['out'] = True
self.assertEqual(junos.cli('show version', **args), ret)
mock_cli.assert_called_with('show version', 'xml', warning=False)
mock_to_string.assert_called_once_with('<root><a>test</a></root>')
assert mock_jxml.called
@patch('jnpr.junos.device.Device.cli')
def test_cli_exception_in_cli(self, mock_cli):
mock_cli.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Execution failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.cli('show version'), ret)
@patch('salt.modules.junos.fopen')
@patch('jnpr.junos.device.Device.cli')
def test_cli_write_output(self, mock_cli, mock_fopen):
mock_cli.return_value = 'cli text output'
args = {'__pub_user': 'root',
'__pub_arg': [{'dest': 'copy/output/here'}],
'dest': 'copy/output/here',
'__pub_fun': 'junos.cli',
'__pub_jid': '20170221182531323467',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = 'cli text output'
ret['out'] = True
junos.cli('show version', **args)
mock_fopen.assert_called_with('copy/output/here', 'w')
def test_shutdown_without_args(self):
ret = dict()
ret['message'] = \
'Provide either one of the arguments: shutdown or reboot.'
ret['out'] = False
self.assertEqual(junos.shutdown(), ret)
@patch('salt.modules.junos.SW.reboot')
def test_shutdown_with_reboot_args(self, mock_reboot):
ret = dict()
ret['message'] = 'Successfully powered off/rebooted.'
ret['out'] = True
args = {'__pub_user': 'root', '__pub_arg': [{'reboot': True}],
'reboot': True, '__pub_fun': 'junos.shutdown',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
self.assertEqual(junos.shutdown(**args), ret)
assert mock_reboot.called
@patch('salt.modules.junos.SW.poweroff')
def test_shutdown_with_poweroff_args(self, mock_poweroff):
ret = dict()
ret['message'] = 'Successfully powered off/rebooted.'
ret['out'] = True
args = {'__pub_user': 'root', '__pub_arg': [{'shutdown': True}],
'reboot': True, '__pub_fun': 'junos.shutdown',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
self.assertEqual(junos.shutdown(**args), ret)
assert mock_poweroff.called
def test_shutdown_with_shutdown_as_false(self):
ret = dict()
ret['message'] = 'Nothing to be done.'
ret['out'] = False
args = {'__pub_user': 'root', '__pub_arg': [{'shutdown': False}],
'reboot': True, '__pub_fun': 'junos.shutdown',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
self.assertEqual(junos.shutdown(**args), ret)
@patch('salt.modules.junos.SW.poweroff')
def test_shutdown_with_in_min_arg(self, mock_poweroff):
args = {'__pub_user': 'root',
'in_min': 10,
'__pub_arg': [{'in_min': 10,
'shutdown': True}],
'reboot': True,
'__pub_fun': 'junos.shutdown',
'__pub_jid': '20170222231445709212',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.shutdown(**args)
mock_poweroff.assert_called_with(in_min=10)
@patch('salt.modules.junos.SW.reboot')
def test_shutdown_with_at_arg(self, mock_reboot):
args = {'__pub_user': 'root',
'__pub_arg': [{'at': '12:00 pm',
'reboot': True}],
'reboot': True,
'__pub_fun': 'junos.shutdown',
'__pub_jid': '201702276857',
'at': '12:00 pm',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.shutdown(**args)
mock_reboot.assert_called_with(at='12:00 pm')
@patch('salt.modules.junos.SW.poweroff')
def test_shutdown_fail_with_exception(self, mock_poweroff):
mock_poweroff.side_effect = self.raise_exception
args = {'__pub_user': 'root', '__pub_arg': [{'shutdown': True}],
'shutdown': True, '__pub_fun': 'junos.shutdown',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = 'Could not poweroff/reboot beacause "Test exception"'
ret['out'] = False
self.assertEqual(junos.shutdown(**args), ret)
def test_install_config_without_args(self):
ret = dict()
ret['message'] = \
'Please provide the salt path where the configuration is present'
ret['out'] = False
self.assertEqual(junos.install_config(), ret)
@patch('os.path.isfile')
def test_install_config_cp_fails(self, mock_isfile):
mock_isfile.return_value = False
ret = dict()
ret['message'] = 'Invalid file path.'
ret['out'] = False
self.assertEqual(junos.install_config('path'), ret)
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_file_cp_fails(self, mock_getsize, mock_isfile):
mock_isfile.return_value = True
mock_getsize.return_value = 0
ret = dict()
ret['message'] = 'Template failed to render'
ret['out'] = False
self.assertEqual(junos.install_config('path'), ret)
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(junos.install_config('actual/path/config.set'), ret)
mock_load.assert_called_with(path='test/path/config', format='set')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_xml_file(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(junos.install_config('actual/path/config.xml'), ret)
mock_load.assert_called_with(path='test/path/config', format='xml')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_text_file(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(junos.install_config('actual/path/config'), ret)
mock_load.assert_called_with(path='test/path/config', format='text')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_replace(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
args = {'__pub_user': 'root', '__pub_arg': [{'replace': True}],
'replace': True, '__pub_fun': 'junos.install_config',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(
junos.install_config(
'actual/path/config.set',
**args),
ret)
mock_load.assert_called_with(
path='test/path/config',
format='set',
merge=False)
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_overwrite(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
args = {'__pub_user': 'root', '__pub_arg': [{'overwrite': True}],
'overwrite': True, '__pub_fun': 'junos.install_config',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(
junos.install_config(
'actual/path/config.xml',
**args),
ret)
mock_load.assert_called_with(
path='test/path/config',
format='xml',
overwrite=True)
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_overwrite_false(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
args = {'__pub_user': 'root', '__pub_arg': [{'overwrite': False}],
'overwrite': False, '__pub_fun': 'junos.install_config',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(
junos.install_config(
'actual/path/config',
**args),
ret)
mock_load.assert_called_with(
path='test/path/config', format='text', merge=True)
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_load_causes_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_load.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not load configuration due to : "Test exception"'
ret['format'] = 'set'
ret['out'] = False
self.assertEqual(
junos.install_config(
path='actual/path/config.set'), ret)
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_no_diff(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = None
ret = dict()
ret['message'] = 'Configuration already applied!'
ret['out'] = True
self.assertEqual(junos.install_config('actual/path/config'), ret)
@patch('salt.modules.junos.fopen')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_write_diff(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit,
mock_fopen):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
args = {'__pub_user': 'root',
'__pub_arg': [{'diffs_file': 'copy/config/here'}],
'diffs_file': 'copy/config/here',
'__pub_fun': 'junos.install_config',
'__pub_jid': '20170222213858582619',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(
junos.install_config(
'actual/path/config',
**args),
ret)
mock_fopen.assert_called_with('copy/config/here', 'w')
@patch('salt.modules.junos.fopen')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_write_diff_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit,
mock_fopen):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
mock_fopen.side_effect = self.raise_exception
args = {'__pub_user': 'root',
'__pub_arg': [{'diffs_file': 'copy/config/here'}],
'diffs_file': 'copy/config/here',
'__pub_fun': 'junos.install_config',
'__pub_jid': '20170222213858582619',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = 'Could not write into diffs_file due to: "Test exception"'
ret['out'] = False
self.assertEqual(
junos.install_config(
'actual/path/config',
**args),
ret)
mock_fopen.assert_called_with('copy/config/here', 'w')
@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_commit_params(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
args = {'comment': 'committed via salt',
'__pub_user': 'root',
'__pub_arg': [{'comment': 'committed via salt',
'confirm': 3}],
'confirm': 3,
'__pub_fun': 'junos.commit',
'__pub_jid': '20170221182856987820',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully loaded and committed!'
ret['out'] = True
self.assertEqual(
junos.install_config(
'actual/path/config',
**args),
ret)
mock_commit.assert_called_with(comment='committed via salt', confirm=3)

@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_commit_check_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Commit check threw the following exception: "Test exception"'
ret['out'] = False
self.assertEqual(junos.install_config('actual/path/config.xml'), ret)

@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_commit_check_fails(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = False
ret = dict()
ret['message'] = 'Loaded configuration but commit check failed.'
ret['out'] = False
self.assertEqual(junos.install_config('actual/path/config.xml'), ret)

@patch('jnpr.junos.utils.config.Config.commit')
@patch('jnpr.junos.utils.config.Config.commit_check')
@patch('jnpr.junos.utils.config.Config.diff')
@patch('jnpr.junos.utils.config.Config.load')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_config_commit_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_load,
mock_diff,
mock_commit_check,
mock_commit):
mock_isfile.return_value = True
mock_getsize.return_value = 10
mock_mkstemp.return_value = 'test/path/config'
mock_diff.return_value = 'diff'
mock_commit_check.return_value = True
mock_commit.side_effect = self.raise_exception
ret = dict()
ret['message'] = \
'Commit check successful but commit failed with "Test exception"'
ret['out'] = False
self.assertEqual(junos.install_config('actual/path/config'), ret)

@patch('jnpr.junos.device.Device.cli')
def test_zeroize(self, mock_cli):
result = junos.zeroize()
ret = dict()
ret['out'] = True
ret['message'] = 'Completed zeroize and rebooted'
mock_cli.assert_called_once_with('request system zeroize')
self.assertEqual(result, ret)

@patch('jnpr.junos.device.Device.cli')
def test_zeroize_throw_exception(self, mock_cli):
mock_cli.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not zeroize due to : "Test exception"'
ret['out'] = False
self.assertEqual(junos.zeroize(), ret)

def test_install_os_without_args(self):
ret = dict()
ret['message'] = \
'Please provide the salt path where the junos image is present.'
ret['out'] = False
self.assertEqual(junos.install_os(), ret)

@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os_cp_fails(self, mock_getsize, mock_isfile):
mock_getsize.return_value = 10
mock_isfile.return_value = False
ret = dict()
ret['message'] = 'Invalid image path.'
ret['out'] = False
self.assertEqual(junos.install_os('/image/path/'), ret)

@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os_image_cp_fails(
self, mock_getsize, mock_isfile):
mock_getsize.return_value = 0
mock_isfile.return_value = True
ret = dict()
ret['message'] = 'Failed to copy image'
ret['out'] = False
self.assertEqual(junos.install_os('/image/path/'), ret)

@patch('jnpr.junos.utils.sw.SW.install')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_install):
mock_getsize.return_value = 10
mock_isfile.return_value = True
ret = dict()
ret['out'] = True
ret['message'] = 'Installed the os.'
self.assertEqual(junos.install_os('path'), ret)

@patch('jnpr.junos.utils.sw.SW.reboot')
@patch('jnpr.junos.utils.sw.SW.install')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os_with_reboot_arg(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_install,
mock_reboot):
mock_getsize.return_value = 10
mock_isfile.return_value = True
args = {'__pub_user': 'root', '__pub_arg': [{'reboot': True}],
'reboot': True, '__pub_fun': 'junos.install_os',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = 'Successfully installed and rebooted!'
ret['out'] = True
self.assertEqual(junos.install_os('path', **args), ret)

@patch('jnpr.junos.utils.sw.SW.install')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os_pyez_install_throws_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_install):
mock_getsize.return_value = 10
mock_isfile.return_value = True
mock_install.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Installation failed due to: "Test exception"'
ret['out'] = False
self.assertEqual(junos.install_os('path'), ret)

@patch('jnpr.junos.utils.sw.SW.reboot')
@patch('jnpr.junos.utils.sw.SW.install')
@patch('salt.modules.junos.safe_rm')
@patch('salt.modules.junos.files.mkstemp')
@patch('os.path.isfile')
@patch('os.path.getsize')
def test_install_os_with_reboot_raises_exception(
self,
mock_getsize,
mock_isfile,
mock_mkstemp,
mock_safe_rm,
mock_install,
mock_reboot):
mock_getsize.return_value = 10
mock_isfile.return_value = True
mock_reboot.side_effect = self.raise_exception
args = {'__pub_user': 'root', '__pub_arg': [{'reboot': True}],
'reboot': True, '__pub_fun': 'junos.install_os',
'__pub_jid': '20170222213858582619', '__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob', '__pub_ret': ''}
ret = dict()
ret['message'] = \
'Installation successful but reboot failed due to : "Test exception"'
ret['out'] = False
self.assertEqual(junos.install_os('path', **args), ret)

def test_file_copy_without_args(self):
ret = dict()
ret['message'] = \
'Please provide the absolute path of the file to be copied.'
ret['out'] = False
self.assertEqual(junos.file_copy(), ret)

@patch('os.path.isfile')
def test_file_copy_invalid_src(self, mock_isfile):
mock_isfile.return_value = False
ret = dict()
ret['message'] = 'Invalid source file path'
ret['out'] = False
self.assertEqual(junos.file_copy('invalid/file/path', 'file'), ret)

def test_file_copy_without_dest(self):
ret = dict()
ret['message'] = \
'Please provide the absolute path of the destination where the file is to be copied.'
ret['out'] = False
with patch('salt.modules.junos.os.path.isfile') as mck:
mck.return_value = True
self.assertEqual(junos.file_copy('/home/user/config.set'), ret)

@patch('salt.modules.junos.SCP')
@patch('os.path.isfile')
def test_file_copy(self, mock_isfile, mock_scp):
mock_isfile.return_value = True
ret = dict()
ret['message'] = 'Successfully copied file from test/src/file to file'
ret['out'] = True
self.assertEqual(
junos.file_copy(
dest='file',
src='test/src/file'),
ret)

@patch('salt.modules.junos.SCP')
@patch('os.path.isfile')
def test_file_copy_exception(self, mock_isfile, mock_scp):
mock_isfile.return_value = True
mock_scp.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'Could not copy file : "Test exception"'
ret['out'] = False
self.assertEqual(
junos.file_copy(
dest='file',
src='test/src/file'),
ret)

# These test cases cover the __virtual__ function, used internally by salt
# to check whether this module is loadable. It is not called directly by
# end users.
def test_virtual_proxy_unavailable(self):
with patch.dict(junos.__opts__, {}):
res = (False, 'The junos module could not be '
'loaded: junos-eznc or jxmlease or proxy could not be loaded.')
self.assertEqual(junos.__virtual__(), res)

def test_virtual_all_true(self):
with patch.dict(junos.__opts__, {'proxy': 'test'}):
self.assertEqual(junos.__virtual__(), 'junos')
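
# The two tests above exercise salt's __virtual__ loader gating. As
# illustration only, a hypothetical sketch of the pattern being tested
# (not the actual salt.modules.junos implementation; HAS_JUNOS is an
# assumed module-level flag set by the import guard):
#
#     def __virtual__():
#         if HAS_JUNOS and 'proxy' in __opts__:
#             return 'junos'
#         return (False, 'The junos module could not be loaded: '
#                 'junos-eznc or jxmlease or proxy could not be loaded.')
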
def test_rpc_without_args(self):
ret = dict()
ret['message'] = 'Please provide the rpc to execute.'
ret['out'] = False
self.assertEqual(junos.rpc(), ret)

@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_config_exception(self, mock_execute):
mock_execute.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'RPC execution failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.rpc('get_config'), ret)

@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_config_filter(
self,
mock_execute):
mock_execute.return_value = etree.XML('<reply><rpc/></reply>')
args = {'__pub_user': 'root',
'__pub_arg': ['get-config',
{'filter': '<configuration><system/></configuration>'}],
'__pub_fun': 'junos.rpc',
'__pub_jid': '20170314162715866528',
'__pub_tgt': 'mac_min',
'__pub_tgt_type': 'glob',
'filter': '<configuration><system/></configuration>',
'__pub_ret': ''}
junos.rpc('get-config', **args)
exec_args = mock_execute.call_args
expected_rpc = '<get-configuration dev_timeout="30" ' \
'format="xml"><configuration><system/></configuration></get-configuration>'
self.assertEqualXML(exec_args[0][0], expected_rpc)

@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_interface_information(self, mock_execute):
junos.rpc('get-interface-information', format='json')
args = mock_execute.call_args
expected_rpc = '<get-interface-information format="json"/>'
self.assertEqualXML(args[0][0], expected_rpc)
self.assertEqual(args[1], {'dev_timeout': 30})

@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_interface_information_with_kwargs(
self, mock_execute):
args = {'__pub_user': 'root',
'__pub_arg': ['get-interface-information',
'',
'text',
{'terse': True,
'interface_name': 'lo0'}],
'terse': True,
'__pub_fun': 'junos.rpc',
'__pub_jid': '20170314160943363563',
'__pub_tgt': 'mac_min',
'interface_name': 'lo0',
'__pub_tgt_type': 'glob',
'__pub_ret': ''}
junos.rpc('get-interface-information', format='text', **args)
args = mock_execute.call_args
expected_rpc = (
'<get-interface-information format="text">'
'<terse/><interface-name>lo0</interface-name></get-interface-information>'
)
self.assertEqualXML(etree.tostring(args[0][0]), expected_rpc)

@patch('salt.modules.junos.jxmlease.parse')
@patch('salt.modules.junos.etree.tostring')
@patch('salt.modules.junos.logging.Logger.warning')
@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_chassis_inventory_filter_as_arg(
self, mock_execute, mock_warning, mock_tostring, mock_jxmlease):
junos.rpc(
'get-chassis-inventory',
filter='<configuration><system/></configuration>')
mock_warning.assert_called_with(
'Filter ignored as it is only used with "get-config" rpc')

@patch('jnpr.junos.device.Device.execute')
def test_rpc_get_interface_information_exception(
self, mock_execute):
mock_execute.side_effect = self.raise_exception
ret = dict()
ret['message'] = 'RPC execution failed due to "Test exception"'
ret['out'] = False
self.assertEqual(junos.rpc('get_interface_information'), ret)

@patch('jnpr.junos.device.Device.execute')
def test_rpc_write_file_format_text(self, mock_execute):
mock_execute.return_value = etree.XML(
'<rpc-reply>text rpc reply</rpc-reply>')
m = mock_open()
with patch('salt.modules.junos.fopen', m, create=True):
junos.rpc('get-chassis-inventory', '/path/to/file', 'text')
handle = m()
handle.write.assert_called_with('text rpc reply')

@patch('salt.modules.junos.json.dumps')
@patch('jnpr.junos.device.Device.execute')
def test_rpc_write_file_format_json(self, mock_execute, mock_dumps):
mock_dumps.return_value = 'json rpc reply'
m = mock_open()
with patch('salt.modules.junos.fopen', m, create=True):
junos.rpc('get-chassis-inventory', '/path/to/file', format='json')
handle = m()
handle.write.assert_called_with('json rpc reply')

@patch('salt.modules.junos.jxmlease.parse')
@patch('salt.modules.junos.etree.tostring')
@patch('jnpr.junos.device.Device.execute')
def test_rpc_write_file(self, mock_execute, mock_tostring, mock_parse):
mock_tostring.return_value = 'xml rpc reply'
m = mock_open()
with patch('salt.modules.junos.fopen', m, create=True):
junos.rpc('get-chassis-inventory', '/path/to/file')
handle = m()
handle.write.assert_called_with('xml rpc reply')
# tests/test_L2_drawer.py (jinjamator/N2G, MIT license)
import sys
sys.path.insert(0, '..')
# after updating sys.path, N2G can be imported from the parent directory
from N2G import drawio_diagram as create_drawio_diagram
from N2G import yed_diagram as create_yed_diagram
from N2G.N2G_L2_Drawer import layer_2_drawer

def test_cdp_drawing_yed_data_dict():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface GigabitEthernet4/6
description switch-2: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 1771,1887
switchport mode trunk
mtu 9216
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
switch-2#show run
interface GigabitEthernet1/5
description switch-1: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
"""
]
}
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
assert drawer.parsed_data == {'Cisco_IOS': {'switch-1': {'cdp_peers': [{'source': 'switch-1',
'src_label': 'Ge4/6',
'target': {'bottom_label': 'cisco WS-C6509', 'id': 'switch-2', 'top_label': '10.2.2.2'},
'trgt_label': 'Ge1/5'},
{'source': 'switch-1',
'src_label': 'Ge1/1',
'target': {'bottom_label': 'cisco WS-C3560-48TS', 'id': 'switch-3', 'top_label': '10.3.3.3'},
'trgt_label': 'Ge0/1'},
{'source': 'switch-1',
'src_label': 'Ge1/2',
'target': {'bottom_label': 'cisco WS-C3560-48TS', 'id': 'switch-4', 'top_label': '10.4.4.4'},
'trgt_label': 'Ge0/10'}],
'interfaces': {'Ge1/1': {'description': 'switch-3:Gi0/1',
'is_l2': True,
'l2_mode': 'trunk',
'mtu': '9216',
'trunk_vlans': '1771,1887'},
'Ge1/2': {'description': 'SW4 Routing Peering', 'ip': '10.0.0.1 255.255.255.0', 'vrf': 'VRF1'},
'Ge4/6': {'access_vlan': '2150', 'description': 'switch-2: access', 'is_l2': True, 'l2_mode': 'access'}}},
'switch-2': {'cdp_peers': [{'source': 'switch-2',
'src_label': 'Ge1/5',
'target': {'bottom_label': 'cisco WS-C6509', 'id': 'switch-1', 'top_label': '10.1.1.1'},
'trgt_label': 'Ge4/6'}],
'interfaces': {'Ge1/5': {'access_vlan': '2150', 'description': 'switch-1: access', 'is_l2': True, 'l2_mode': 'access'}}}}}
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_base.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict()

def test_cdp_drawing_yed_data_path():
data = "./Data/SAMPLE_CDP_LLDP/"
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
assert drawer.parsed_data == {'Cisco_IOS': {'switch-1': {'cdp_peers': [{'source': 'switch-1',
'src_label': 'Ge4/6',
'target': {'bottom_label': 'cisco WS-C6509', 'id': 'switch-2', 'top_label': '10.2.2.2'},
'trgt_label': 'Ge1/5'},
{'source': 'switch-1',
'src_label': 'Ge1/1',
'target': {'bottom_label': 'cisco WS-C3560-48TS', 'id': 'switch-3', 'top_label': '10.3.3.3'},
'trgt_label': 'Ge0/1'},
{'source': 'switch-1',
'src_label': 'Ge1/2',
'target': {'bottom_label': 'cisco WS-C3560-48TS', 'id': 'switch-4', 'top_label': '10.4.4.4'},
'trgt_label': 'Ge0/10'}],
'interfaces': {'Ge1/1': {'description': 'switch-3:Gi0/1',
'is_l2': True,
'l2_mode': 'trunk',
'mtu': '9216',
'trunk_vlans': '1771,1887'},
'Ge1/2': {'description': 'SW4 Routing Peering', 'ip': '10.0.0.1 255.255.255.0', 'vrf': 'VRF1'},
'Ge4/6': {'access_vlan': '2150', 'description': 'switch-2: access', 'is_l2': True, 'l2_mode': 'access'}}},
'switch-2': {'cdp_peers': [{'source': 'switch-2',
'src_label': 'Ge1/5',
'target': {'bottom_label': 'cisco WS-C6509', 'id': 'switch-1', 'top_label': '10.1.1.1'},
'trgt_label': 'Ge4/6'}],
'interfaces': {'Ge1/5': {'access_vlan': '2150', 'description': 'switch-1: access', 'is_l2': True, 'l2_mode': 'access'}}}}}
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_path.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_path.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_base.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_path()

def test_cdp_drawing_yed_data_dict_add_lag():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
]
}
config = {
"add_lag": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_add_lag.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_add_lag.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_add_lag.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_add_lag()

def test_cdp_drawing_yed_data_dict_group_links():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
]
}
config = {
"group_links": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_group_links.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_group_links.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_group_links.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_group_links()

def test_cdp_drawing_yed_data_dict_group_links_add_lag():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/8, Port ID (outgoing port): GigabitEthernet1/8
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/24, Port ID (outgoing port): GigabitEthernet0/14
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-2: trunk standalone
switchport
switchport trunk allowed vlan 300-305
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
interface GigabitEthernet1/24
description SW4 Routing Peering VRF2
vrf forwarding VRF2
ip address 10.0.1.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/8, Port ID (outgoing port): GigabitEthernet4/8
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/8
description switch-1: trunk standalone
switchport
switchport trunk allowed vlan 300-305
switchport mode trunk
"""
]
}
config = {
"group_links": True,
"add_lag": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_group_links_add_lag.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_group_links_add_lag.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_group_links_add_lag.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_group_links_add_lag()

def test_lldp_drawing_yed_data_dict():
data = {"Cisco_IOS": ["""
switch-1#show lldp neighbors detail
------------------------------------------------
Local Intf: GigabitEthernet4/6
Port id: GigabitEthernet1/5
System Name: switch-2.com
System Capabilities: B,R
Management Addresses:
IP: 10.2.2.2
------------------------------------------------
Local Intf: GigabitEthernet1/1
Port id: GigabitEthernet0/1
System Name: switch-3.com
System Capabilities: B,R
Management Addresses:
IP: 10.3.3.3
------------------------------------------------
Local Intf: GigabitEthernet1/2
Port id: GigabitEthernet0/10
System Name: switch-4.com
System Capabilities: B,R
Management Addresses:
IP: 10.4.4.4
switch-1#show run
interface GigabitEthernet4/6
description switch-2: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 1771,1887
switchport mode trunk
mtu 9216
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show lldp neighbors detail
------------------------------------------------
Local Intf: GigabitEthernet1/5
Port id: GigabitEthernet4/6
System Name: switch-1.com
System Capabilities: B,R
Management Addresses:
IP: 10.1.1.1
switch-2#show run
interface GigabitEthernet1/5
description switch-1: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
"""
]
}
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_lldp_drawing_yed_data_dict.graphml", folder="./Output/")
with open("./Output/test_lldp_drawing_yed_data_dict.graphml") as produced:
with open("./Output/should_be_test_lldp_drawing_yed_data_dict.graphml") as should_be:
assert produced.read() == should_be.read()
# test_lldp_drawing_yed_data_dict()
def test_cdp_drawing_yed_data_dict_add_vlans_to_nodes():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
]
}
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_add_vlans_to_nodes.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_add_vlans_to_nodes.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_add_vlans_to_nodes.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_add_vlans_to_nodes()
def test_cdp_drawing_yed_data_dict_interfaces_state():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
]
}
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_interfaces_state.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_interfaces_state.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_interfaces_state.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_interfaces_state()
def test_cdp_drawing_yed_data_dict_lag_interfaces_state():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
]
}
config = {"add_lag": True}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_lag_interfaces_state.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_lag_interfaces_state.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_lag_interfaces_state.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_lag_interfaces_state()
def test_cdp_drawing_yed_data_dict_add_all_connected():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
!
interface GigabitEthernet4/9
switchport
switchport trunk allowed vlan 230
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
!
GigabitEthernet4/8 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4848 (bia a89d.2163.4848)
Description: switch-22: trunk
MTU 5000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
GigabitEthernet4/9 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
]
}
config = {"add_all_connected": True}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_add_all_connected.graphml", folder="./Output/")
with open("./Output/test_cdp_drawing_yed_data_dict_add_all_connected.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_add_all_connected.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_add_all_connected()
def test_cdp_drawing_yed_data_dict_add_all_connected_add_lag():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface Port-channel48
description switch-22:LAG trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
channel-group 48 mode active
!
interface GigabitEthernet5/1
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
channel-group 48 mode active
!
interface GigabitEthernet4/9
switchport
switchport trunk allowed vlan 230
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
!
GigabitEthernet4/8 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4848 (bia a89d.2163.4848)
Description: switch-22: trunk
MTU 5000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel48 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-22:LAG trunk
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/8
!
GigabitEthernet4/9 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
GigabitEthernet5/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
]
}
config = {"add_all_connected": True, "add_lag": True}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_add_all_connected_add_lag.graphml", folder="./Output/")
with open ("./Output/test_cdp_drawing_yed_data_dict_add_all_connected_add_lag.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_add_all_connected_add_lag.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_add_all_connected_add_lag()
def test_cdp_drawing_drawio_data_dict():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface GigabitEthernet4/6
description switch-2: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 1771,1887
switchport mode trunk
mtu 9216
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
switch-2#show run
interface GigabitEthernet1/5
description switch-1: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
"""
]
}
config = {}
drawing = create_drawio_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict.drawio", folder="./Output/")
with open("./Output/test_cdp_drawing_drawio_data_dict.drawio") as produced:
with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_or_path.drawio") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_drawio_data_dict()
def test_cdp_drawing_drawio_data_path():
data = "./Data/SAMPLE_CDP_LLDP/"
config = {}
drawing = create_drawio_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_path.drawio", folder="./Output/")
with open("./Output/test_cdp_drawing_drawio_data_path.drawio") as produced:
with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_or_path.drawio") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_drawio_data_path()
def test_cdp_drawing_drawio_data_dict_add_lag():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
]
}
config = {"add_lag": True}
drawing = create_drawio_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_add_lag.drawio", folder="./Output/")
with open("./Output/test_cdp_drawing_drawio_data_dict_add_lag.drawio") as produced:
with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_add_lag.drawio") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_drawio_data_dict_add_lag()
def test_cdp_drawing_drawio_data_dict_group_links():
data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
    ]
    }
    config = {
        "group_links": True
    }
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_group_links.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_group_links.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_group_links.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_drawio_data_dict_group_links_add_lag():
    data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/8, Port ID (outgoing port): GigabitEthernet1/8
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/24, Port ID (outgoing port): GigabitEthernet0/14
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-2: trunk standalone
switchport
switchport trunk allowed vlan 300-305
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
interface GigabitEthernet1/24
description SW4 Routing Peering VRF2
vrf forwarding VRF2
ip address 10.0.1.1 255.255.255.0
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/8, Port ID (outgoing port): GigabitEthernet4/8
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/8
description switch-1: trunk standalone
switchport
switchport trunk allowed vlan 300-305
switchport mode trunk
"""
    ]
    }
    config = {
        "group_links": True,
        "add_lag": True
    }
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_group_links_add_lag.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_group_links_add_lag.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_group_links_add_lag.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_lldp_drawing_drawio_data_dict():
    data = {"Cisco_IOS": ["""
switch-1#show lldp neighbors detail
------------------------------------------------
Local Intf: GigabitEthernet4/6
Port id: GigabitEthernet1/5
System Name: switch-2.com
System Capabilities: B,R
Management Addresses:
IP: 10.2.2.2
------------------------------------------------
Local Intf: GigabitEthernet1/1
Port id: GigabitEthernet0/1
System Name: switch-3.com
System Capabilities: B,R
Management Addresses:
IP: 10.3.3.3
------------------------------------------------
Local Intf: GigabitEthernet1/2
Port id: GigabitEthernet0/10
System Name: switch-4.com
System Capabilities: B,R
Management Addresses:
IP: 10.4.4.4
switch-1#show run
interface GigabitEthernet4/6
description switch-2: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 1771,1887
switchport mode trunk
mtu 9216
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
""",
"""
switch-2#show lldp neighbors detail
------------------------------------------------
Local Intf: GigabitEthernet1/5
Port id: GigabitEthernet4/6
System Name: switch-1.com
System Capabilities: B,R
Management Addresses:
IP: 10.1.1.1
switch-2#show run
interface GigabitEthernet1/5
description switch-1: access
switchport
switchport access vlan 2150
switchport mode access
spanning-tree portfast edge
"""
    ]
    }
    config = {}
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_lldp_drawing_drawio_data_dict.drawio", folder="./Output/")
    with open("./Output/test_lldp_drawing_drawio_data_dict.drawio") as produced:
        with open("./Output/should_be_test_lldp_drawing_drawio_data_dict.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_drawio_data_dict_add_vlans_to_nodes():
    data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
    ]
    }
    config = {}
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_add_vlans_to_nodes.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_add_vlans_to_nodes.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_add_vlans_to_nodes.drawio") as should_be:
            assert produced.read() == should_be.read()
# test_cdp_drawing_drawio_data_dict_add_vlans_to_nodes()
def test_cdp_drawing_drawio_data_dict_interfaces_state():
    data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
vlan 200
name ProdVMS
!
vlan 101
name test_vlan
"""
    ]
    }
    config = {}
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_interfaces_state.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_interfaces_state.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_interfaces_state.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_drawio_data_dict_add_all_connected():
    data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
!
interface GigabitEthernet4/9
switchport
switchport trunk allowed vlan 230
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
!
GigabitEthernet4/8 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4848 (bia a89d.2163.4848)
Description: switch-22: trunk
MTU 5000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
GigabitEthernet4/9 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
    ]
    }
    config = {"add_all_connected": True}
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_add_all_connected.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_add_all_connected.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_add_all_connected.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_drawio_data_dict_add_all_connected_add_lag():
    data = {"Cisco_IOS": ["""
switch-1#show cdp neighbors detail
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/6, Port ID (outgoing port): GigabitEthernet1/5
-------------------------
Device ID: switch-2
Entry address(es):
IP address: 10.2.2.2
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet4/7, Port ID (outgoing port): GigabitEthernet1/6
-------------------------
Device ID: switch-3
Entry address(es):
IP address: 10.3.3.3
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/1, Port ID (outgoing port): GigabitEthernet0/1
-------------------------
Device ID: switch-4
Entry address(es):
IP address: 10.4.4.4
Platform: cisco WS-C3560-48TS, Capabilities: Switch IGMP
Interface: GigabitEthernet1/2, Port ID (outgoing port): GigabitEthernet0/10
switch-1#show run
interface Port-channel3
description switch-2: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface Port-channel11
description switch-3: trunk LAG
switchport
switchport trunk allowed vlan 101
switchport mode trunk
!
interface Port-channel48
description switch-22:LAG trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
!
interface GigabitEthernet4/6
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/7
description switch-2: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet4/8
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
channel-group 48 mode active
!
interface GigabitEthernet5/1
description switch-22: trunk
switchport
switchport trunk allowed vlan 209
switchport mode trunk
channel-group 48 mode active
!
interface GigabitEthernet4/9
switchport
switchport trunk allowed vlan 230
switchport mode trunk
!
interface GigabitEthernet1/1
description switch-3:Gi0/1
switchport
switchport trunk allowed vlan 101
switchport mode trunk
mtu 9216
channel-group 11 mode active
!
interface GigabitEthernet1/2
description SW4 Routing Peering
vrf forwarding VRF1
ip address 10.0.0.1 255.255.255.0
!
switch-1#show interface
GigabitEthernet1/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.1111 (bia a89d.2163.1111)
Description: switch-3:Gi0/1
MTU 9216 bytes, BW 10000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, link type is auto, media type is 10GBase-LR
!
GigabitEthernet1/2 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.2222 (bia a89d.2163.2222)
Description: SW4 Routing Peering
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel3 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-2: trunk LAG
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/6 Ge4/7
!
GigabitEthernet4/8 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4848 (bia a89d.2163.4848)
Description: switch-22: trunk
MTU 5000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
Port-channel48 is up, line protocol is up (connected)
Hardware is EtherChannel, address is a89d.2163.3333 (bia a89d.2163.3333)
Description: switch-22:LAG trunk
MTU 1500 bytes, BW 20000000 Kbit/sec, DLY 10 usec,
Full-duplex, 10Gb/s, media type is N/A
Members in this channel: Ge4/8 Gi5/1
!
GigabitEthernet4/9 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
!
GigabitEthernet5/1 is up, line protocol is up (connected)
Hardware is Ten Gigabit Ethernet Port, address is a89d.2163.4949 (bia a89d.2163.4949)
MTU 7000 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
Full-duplex, 1000Mb/s, link type is auto, media type is 1000BaseT
""",
"""
switch-2#show cdp neighbors detail
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/5, Port ID (outgoing port): GigabitEthernet4/6
-------------------------
Device ID: switch-1
Entry address(es):
IP address: 10.1.1.1
Platform: cisco WS-C6509, Capabilities: Router Switch IGMP
Interface: GigabitEthernet1/6, Port ID (outgoing port): GigabitEthernet4/7
switch-2#show run
interface Port-channel3
description switch-1: trunk LAG
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
!
interface GigabitEthernet1/5
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
!
interface GigabitEthernet1/6
description switch-1: trunk
switchport
switchport trunk allowed vlan 200-205
switchport mode trunk
channel-group 3 mode active
"""
    ]
    }
    config = {"add_all_connected": True, "add_lag": True}
    drawing = create_drawio_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_drawio_data_dict_add_all_connected_add_lag.drawio", folder="./Output/")
    with open("./Output/test_cdp_drawing_drawio_data_dict_add_all_connected_add_lag.drawio") as produced:
        with open("./Output/should_be_test_cdp_drawing_drawio_data_dict_add_all_connected_add_lag.drawio") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_cisco_nxos_base():
    data = {"Cisco_NXOS": [
"""
nxos_switch_1# show cdp nei det
----------------------------------------
Device ID:nxos_switch_2(JPG2212345)
System Name: nxos_switch_2
Interface address(es):
IPv4 Address: 10.2.2.2
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet2/29
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
nxos_switch_1# show lldp nei det
Chassis id: 501c.b09b.1111
Port id: Eth2/29
Local Port id: Eth5/15
Port Description: uplink to ISP via nxos_switch_1
System Name: cust_sw_1
System Description: Cisco NX-OS(tm) n7700, Software (n7700-s3-dk9), Version 91.3(1), RELEASE SOFTWARE Copyright (c) 2002-2003 by Cisco Systems, Inc. Compiled 7/30/2003 12:00:00
Time remaining: 95 seconds
System Capabilities: B, R
Enabled Capabilities: B, R
Management Address: 10.151.1.1
Vlan ID: not advertised
nxos_switch_1# show run int
interface Ethernet5/1
description nxos_switch_2:eth2/29 [L3]
mpls ip
mtu 9216
ip address 1.1.1.1/30
vrf member VRF1
ip address 2.2.2.2/32 secondary
!
interface Ethernet5/15
description cust_sw_1 Eth2/29
switchport
switchport mode trunk
switchport trunk allowed vlan 2122
mtu 9216
nxos_switch_1# show interface
Ethernet5/1 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.1234 (bia 00b0.1111.4444)
Description: nxos_switch_2:eth2/29
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
Ethernet5/15 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.1592 (bia 00b0.1111.9999)
Description: cust_sw_1 Eth2/29
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is trunk
full-duplex, 10 Gb/s, media type is 10G
""",
"""
nxos_switch_2# show cdp nei det
----------------------------------------
Device ID:nxos_switch_1(JPG2212345)
System Name: nxos_switch_1
Interface address(es):
IPv4 Address: 10.1.1.1
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet2/29, Port ID (outgoing port): Ethernet5/1
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.1.1.1
nxos_switch_2# show lldp nei det
Chassis id: 1409.dcaf.5555
Port id: 10GE1/17/21
Local Port id: Eth5/31
Port Description: cust_sw_3
System Name: cust_sw_3
System Description: Huawei Versatile Routing Platform Software
VRP (R) software, Version 8.120 (OSCA V100R005C60)
Copyright (C) 2012-2016 Huawei Technologies Co., Ltd.
HUAWEI OSCA
Time remaining: 113 seconds
System Capabilities: B, R
Enabled Capabilities: B, R
Management Address: 10.152.3.4
Vlan ID: 1
nxos_switch_2# show interface
Ethernet2/29 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.4321 (bia 00b0.1111.3333)
Description: nxos_switch_1:eth5/1
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
Ethernet5/31 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.3131 (bia 00b0.1111.3131)
Description: cust_sw_3 10GE1/17/21
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
nxos_switch_2# show run int
interface Ethernet2/29
description nxos_switch_1:eth5/1 [L3]
mpls ip
mtu 9216
ip address 1.1.1.2/30
vrf member VRF1
ip address 2.2.2.3/32 secondary
!
interface Ethernet5/31
description cust_sw_3 10GE1/17/21
switchport
switchport mode trunk
switchport trunk native vlan 777
switchport trunk allowed vlan 777,1,2,3,4
"""]
    }
    config = {
        "platforms": ["Cisco_NXOS", "Cisco_IOS"]
    }
    drawing = create_yed_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_nxos_base.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_nxos_base.graphml") as produced:
        with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_nxos_base.graphml") as should_be:
            assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_path_cisco_nxos()
def test_cdp_drawing_yed_data_path_cisco_ios_nxos_all():
    data = "./Data/SAMPLE_CDP_LLDP_2/"
    config = {
        "platforms": ["Cisco_NXOS", "Cisco_IOS"],
        "add_all_connected": True,
        "add_lag": True,
        "group_links": True
    }
    drawing = create_yed_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_path_cisco_ios_nxos_all.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_path_cisco_ios_nxos_all.graphml") as produced:
        with open("./Output/should_be_test_cdp_drawing_yed_data_path_cisco_ios_nxos_all.graphml") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_path_cisco_nxos_base():
    data = "./Data/SAMPLE_CDP_LLDP_2/"
    config = {
        "platforms": ["Cisco_NXOS"]
    }
    drawing = create_yed_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_path_cisco_nxos_base.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_path_cisco_nxos_base.graphml") as produced:
        with open("./Output/should_be_test_cdp_drawing_yed_data_path_cisco_nxos_base.graphml") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_path_cisco_nxos_combine_peers():
    data = "./Data/SAMPLE_CDP_LLDP_2/"
    config = {
        "platforms": ["Cisco_NXOS"],
        "combine_peers": True
    }
    drawing = create_yed_diagram()
    drawer = layer_2_drawer(drawing, config)
    drawer.work(data)
    drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_path_cisco_nxos_combine_peers.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_path_cisco_nxos_combine_peers.graphml") as produced:
        with open("./Output/should_be_test_cdp_drawing_yed_data_path_cisco_nxos_combine_peers.graphml") as should_be:
            assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_has_data():
    data = {"Cisco_NXOS": [
"""
nxos_switch_1# show cdp nei det
----------------------------------------
Device ID:nxos_switch_2(JPG2212345)
System Name: nxos_switch_2
Interface address(es):
IPv4 Address: 10.2.2.2
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet2/29
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
----------------------------------------
Device ID:nxos_switch_3(JPG2212343)
System Name: nxos_switch_3
Interface address(es):
IPv4 Address: 10.3.3.3
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet1/33
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
nxos_switch_1# show run int
interface Ethernet5/1
description nxos_switch_2/3: [L3]
mpls ip
mtu 9216
ip address 1.1.1.1/30
vrf member VRF1
ip address 2.2.2.2/32 secondary
nxos_switch_1# show interface
Ethernet5/1 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.1234 (bia 00b0.1111.4444)
Description: nxos_switch_2/3: [L3]
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
""",
"""
nxos_switch_2# show cdp nei det
----------------------------------------
Device ID:nxos_switch_1(JPG2212345)
System Name: nxos_switch_1
Interface address(es):
IPv4 Address: 10.1.1.1
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet2/29, Port ID (outgoing port): Ethernet5/1
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.1.1.1
Time remaining: 113 seconds
System Capabilities: B, R
Enabled Capabilities: B, R
Management Address: 10.152.3.4
Vlan ID: 1
nxos_switch_1# show interface
Ethernet2/29 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.4321 (bia 00b0.1111.3333)
Description: nxos_switch_1:eth5/1
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
nxos_switch_1# show run int
interface Ethernet2/29
description nxos_switch_1:eth5/1 [L3]
mpls ip
mtu 9216
ip address 1.1.1.2/30
vrf member VRF1
ip address 2.2.2.3/32 secondary
"""]
}
config = {
"combine_peers": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_has_data.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_has_data.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_has_data.graphml") as should_be:
assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_two_links():
data = { "Cisco_NXOS": [
"""
nxos_switch_1# show cdp nei det
----------------------------------------
Device ID:nxos_switch_2(JPG2212345)
System Name: nxos_switch_2
Interface address(es):
IPv4 Address: 10.2.2.2
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet2/29
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
----------------------------------------
Device ID:nxos_switch_3(JPG2212343)
System Name: nxos_switch_3
Interface address(es):
IPv4 Address: 10.3.3.3
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet1/33
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
----------------------------------------
Device ID:nxos_switch_2(JPG2212345)
System Name: nxos_switch_2
Interface address(es):
IPv4 Address: 10.2.2.2
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/11, Port ID (outgoing port): Ethernet2/30
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
----------------------------------------
Device ID:nxos_switch_3(JPG2212343)
System Name: nxos_switch_3
Interface address(es):
IPv4 Address: 10.3.3.3
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/11, Port ID (outgoing port): Ethernet1/34
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
nxos_switch_1# show run int
interface Ethernet5/1
description nxos_switch_2/3: [L3]
mpls ip
mtu 9216
ip address 1.1.1.1/30
vrf member VRF1
ip address 2.2.2.2/32 secondary
nxos_switch_1# show interface
Ethernet5/1 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.1234 (bia 00b0.1111.4444)
Description: nxos_switch_2/3: [L3]
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
""",
"""
nxos_switch_2# show cdp nei det
----------------------------------------
Device ID:nxos_switch_1(JPG2212345)
System Name: nxos_switch_1
Interface address(es):
IPv4 Address: 10.1.1.1
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet2/29, Port ID (outgoing port): Ethernet5/1
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.1.1.1
Time remaining: 113 seconds
System Capabilities: B, R
Enabled Capabilities: B, R
Management Address: 10.152.3.4
Vlan ID: 1
nxos_switch_1# show interface
Ethernet2/29 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.4321 (bia 00b0.1111.3333)
Description: nxos_switch_1:eth5/1
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
nxos_switch_1# show run int
interface Ethernet2/29
description nxos_switch_1:eth5/1 [L3]
mpls ip
mtu 9216
ip address 1.1.1.2/30
vrf member VRF1
ip address 2.2.2.3/32 secondary
"""]
}
config = {
"combine_peers": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_two_links.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_two_links.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_two_links.graphml") as should_be:
assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_behind_lag():
data = { "Cisco_NXOS": [
"""
nxos_switch_1# show cdp nei det
----------------------------------------
Device ID:nxos_switch_2(JPG2212345)
System Name: nxos_switch_2
Interface address(es):
IPv4 Address: 10.2.2.2
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet2/29
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
----------------------------------------
Device ID:nxos_switch_3(JPG2212343)
System Name: nxos_switch_3
Interface address(es):
IPv4 Address: 10.3.3.3
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet5/1, Port ID (outgoing port): Ethernet1/33
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.2.2.2
nxos_switch_1# show run int
interface Port-channel51
description nxos_switch_2/3: [L2]
switchport
switchport trunk allowed vlan 209
switchport mode trunk
!
interface Ethernet5/1
description nxos_switch_2/3: [L2, LAG 51]
switchport
switchport trunk allowed vlan 209
switchport mode trunk
channel-group 51 mode active
nxos_switch_1# show interface
Port-channel51 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.1234 (bia 00b0.1111.4444)
Description: nxos_switch_2/3: [L2]
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is trunk
full-duplex, 10 Gb/s, media type is 10G
""",
"""
nxos_switch_2# show cdp nei det
----------------------------------------
Device ID:nxos_switch_1(JPG2212345)
System Name: nxos_switch_1
Interface address(es):
IPv4 Address: 10.1.1.1
Platform: N77-C7711, Capabilities: Router Switch Supports-STP-Dispute
Interface: Ethernet2/29, Port ID (outgoing port): Ethernet5/1
Holdtime: 152 sec
Version:
Cisco Nexus Operating System (NX-OS) Software, Version 18.5(1 )
Advertisement Version: 2
Duplex: full
MTU: 9216
Physical Location: rack, street address
Mgmt address(es):
IPv4 Address: 10.1.1.1
Time remaining: 113 seconds
System Capabilities: B, R
Enabled Capabilities: B, R
Management Address: 10.152.3.4
Vlan ID: 1
nxos_switch_1# show interface
Ethernet2/29 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 8c60.4f53.4321 (bia 00b0.1111.3333)
Description: nxos_switch_1:eth5/1
MTU 9216 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is routed
full-duplex, 10 Gb/s, media type is 10G
nxos_switch_1# show run int
interface Ethernet2/29
description nxos_switch_1:eth5/1 [L3]
mpls ip
mtu 9216
ip address 1.1.1.2/30
vrf member VRF1
ip address 2.2.2.3/32 secondary
"""]
}
config = {
"combine_peers": True,
"add_lag": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_behind_lag.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_behind_lag.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_nxos_combine_peer_behind_lag.graphml") as should_be:
assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_cisco_iosxr():
data = { "Cisco_IOSXR": [
"""
RP/0/RSP0/CPU0:router_XR_01#show run
interface GigabitEthernet0/0/0/19
description To cust_rt_1 Gi1/1
mtu 4484
ipv4 address 10.0.0.1 255.255.255.192
!
interface TenGigE0/0/2/0
description To cust_rt_2 Gi1/2
mtu 4484
ipv4 address 10.1.0.1 255.255.255.192
ipv6 address 2001::4321/64 eui-64
RP/0/RSP0/CPU0:router_XR_01#show lldp neighbors detail
------------------------------------------------
Local Interface: GigabitEthernet0/0/0/8
Chassis id: 0026.9815.c3e6
Port id: Gi0/0/0/8
Port Description: GigabitEthernet0/0/0/8 peer
System Name: asr9k-5
System Description:
Cisco IOS XR Software, Version 4.1.0.32I[Default]
Copyright (c) 2011 by Cisco Systems, Inc.
Time remaining: 102 seconds
Hold Time: 120 seconds
System Capabilities: R
Enabled Capabilities: R
Management Addresses:
IPv4 address: 10.5.173.110
RP/0/RSP0/CPU0:router_XR_01#show cdp neighbors detail
-------------------------
Device ID: cust_rt_1(FGE1234567)
SysName :
Entry address(es):
IPv4 address: 10.0.0.2
Platform: Cisco CISCO3945-CHASSIS, Capabilities: Router Source-Route-Bridge Switch IGMP
Interface: GigabitEthernet0/0/0/19
Port ID (outgoing port): GigabitEthernet1/1
Holdtime : 147 sec
Version :
Cisco IOS Software, C3900 Software (C3900-UNIVERSALK9-M), Version 15.4(3)M1, RELEASE SOFTWARE (fc1)
Technical Support: http://www.cisco.com/techsupport
Copyright (c) 1986-2014 by Cisco Systems, Inc.
Compiled Sat 25-Oct-14 07:15 by prod_rel_team
advertisement version: 2
Duplex: full
-------------------------
Device ID: cust_rt_2
SysName :
Entry address(es):
IPv4 address: 10.1.0.1
IPv6 address: fe80::5e5a:feed:1234:1234
Platform: cisco ASR1001-X, Capabilities: Router IGMP
Interface: TenGigE0/0/2/0
Port ID (outgoing port): GigabitEthernet1/2
Holdtime : 139 sec
Version :
Cisco IOS Software [Fuji], ASR1000 Software (X86_64_LINUX_IOSD-UNIVERSALK9-M), Version 16.9.2, RELEASE SOFTWARE (fc4)
Technical Support: http://www.cisco.com/techsupport
Copyright (c) 1986-2018 by Cisco Systems, Inc.
Compiled Mon 05-Nov-18 19:31 by mcpre
advertisement version: 2
Duplex: full
RP/0/RSP0/CPU0:router_XR_01#show interfaces
GigabitEthernet0/0/0/19 is up, line protocol is up
Interface state transitions: 1
Hardware is GigabitEthernet, address is 4321.f54f.1234 (bia 4321.f54f.1234)
Description: To cust_rt_1 Gi1/1
Internet address is 10.0.0.1/26
MTU 4484 bytes, BW 1000000 Kbit (Max: 1000000 Kbit)
Full-duplex, 1000Mb/s, TFD, link type is force-up
TenGigE0/0/2/0 is up, line protocol is up
Interface state transitions: 1
Hardware is TenGigE, address is 08ec.6789.68e9 (bia 08ec.6789.68e9)
Layer 1 Transport Mode is LAN
Description: To cust_rt_2 Gi1/2
Internet address is 10.1.0.1/26
MTU 4484 bytes, BW 10000000 Kbit (Max: 10000000 Kbit)
Full-duplex, 10000Mb/s, LR, link type is force-up
""",
"""
RP/0/RSP0/CPU0:cust_rt_1#show run
interface GigabitEthernet1/1
description To cust_rt_1 Gi0/0/19
mtu 4484
ipv4 address 10.0.0.2 255.255.255.192
vrf UPSTREAM
RP/0/RSP0/CPU0:cust_rt_1#show cdp neighbors detail
-------------------------
Device ID: router_XR_01
SysName :
Entry address(es):
IPv4 address: 10.0.0.1
Platform: Cisco ASR9001, Capabilities: Router Source-Route-Bridge Switch IGMP
Interface: GigabitEthernet1/1
Port ID (outgoing port): GigabitEthernet0/0/0/19
Holdtime : 147 sec
"""]
}
config = {}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_iosxr.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_iosxr.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_iosxr.graphml") as should_be:
assert produced.read() == should_be.read()
def test_cdp_drawing_yed_data_dict_huawei_base():
data = { "Huawei": [
"""
<HUAWEI-CSW-01>disp cur interface
interface 10GE4/0/17
description TO-SLB-FF_21 Eth9
ip binding vpn-instance SLB-F
ip address 10.123.0.5 255.255.255.252
dot1q termination vid 619
#
interface 10GE4/0/9
description LINK:R-RR1-TOR-581:10GE2/0/5
eth-trunk 11
#
interface Eth-Trunk11
description LINK:R-RR1-TOR-581:LAG
port link-type trunk
port trunk allow-pass vlan 200 210 220 230 240 280 290 300 310 320
port trunk allow-pass vlan 330 340 1201 to 1223
mode lacp-static
dfs-group 1 m-lag 11
<HUAWEI-CSW-01>disp lldp neighbor
10GE4/0/17 has 1 neighbor(s):
Port ID :Ethernet9
Port description :UPLINK to CSW 10GE4/0/17
System name :SLB-FF_21
System description :SLB Vendor BLA
System capabilities supported :bridge router
Management address :10.132.77.91
Maximum frame Size :9123
10GE4/0/9 has 1 neighbor(s):
Port ID :10GE2/0/5
Port description :UPLINK to CSW 10GE4/0/9
System name :R-RR1-TOR-581
System description :Huawei Versatile Routing Platform Software
VRP (R) software, Version 8.150 (NE40E V800R009C10SPC200)
Copyright (C) 2012-2017 Huawei Technologies Co., Ltd.
HUAWEI NE40E-X8A
System capabilities supported :bridge router
Management address :10.1.1.2
Maximum frame Size :9216
<HUAWEI-CSW-01>disp cur vlan
vlan 1201
name Servers_vlan
vlan 1223
name SEC_staff
""",
"""
<HUAWEI-CSW-02>disp cur interface
interface 10GE4/0/19
description TO-SLB-FF_22 Eth99
ip binding vpn-instance SLB-F
ip address 10.123.0.3 255.255.255.252
#
interface 10GE4/0/9
description LINK:R-RR1-TOR-581:10GE2/0/6
eth-trunk 11
#
interface Eth-Trunk11
description LINK:R-RR1-TOR-581:LAG
port link-type trunk
port trunk allow-pass vlan 200 210 220 230 240 280 290 300 310 320
port trunk allow-pass vlan 330 340 1201 to 1223
mode lacp-static
dfs-group 1 m-lag 11
<HUAWEI-CSW-01>disp lldp neighbor
10GE4/0/19 has 1 neighbor(s):
Port ID :Ethernet99
Port description :UPLINK to CSW2 10GE4/0/19
System name :SLB-FF_22
System description :SLB Vendor BLA
System capabilities supported :bridge router
Management address :10.132.77.92
Maximum frame Size :9123
10GE4/0/9 has 1 neighbor(s):
Port ID :10GE2/0/6
Port description :UPLINK to CSW2 10GE4/0/9
System name :R-RR1-TOR-581
System description :Huawei Versatile Routing Platform Software
VRP (R) software, Version 8.150 (NE40E V800R009C10SPC200)
Copyright (C) 2012-2017 Huawei Technologies Co., Ltd.
HUAWEI NE40E-X8A
System capabilities supported :bridge router
Management address :10.1.1.2
Maximum frame Size :9216
"""
]
}
config = {
# "add_lag": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_huawei_base.graphml", folder="./Output/")
def test_cdp_drawing_yed_data_dict_cisco_iosxr_lldp_behind_lag():
data = { "Cisco_IOSXR": [
"""
RP/0/RSP0/CPU0:router_XR_01#show run
interface GigabitEthernet0/0/0/19
description To cust_rt_1 Gi0/0/0/8
bundle id 100 mode active
!
interface Bundle-Ether100
description To router_XR_01 Gi0/0/19 LAG 200
RP/0/RSP0/CPU0:router_XR_01#show lldp neighbors detail
------------------------------------------------
Local Interface: GigabitEthernet0/0/0/19
Chassis id: 0026.9815.c3e6
Port id: GigabitEthernet0/0/0/8
Port Description: To router_XR_01 Gi0/0/19
System Name: cust_rt_1
System Description:
Cisco IOS XR Software, Version 4.1.0.32I[Default]
Copyright (c) 2011 by Cisco Systems, Inc.
Time remaining: 102 seconds
Hold Time: 120 seconds
System Capabilities: R
Enabled Capabilities: R
Management Addresses:
IPv4 address: 10.5.173.110
------------------------------------------------
Local Interface: GigabitEthernet0/0/0/19
Chassis id: 0026.9815.c3e6
Port id: Bundle-Ether200
Port Description: To router_XR_01 Gi0/0/19 LAG 100
System Name: cust_rt_1
System Description:
Cisco IOS XR Software, Version 4.1.0.32I[Default]
Copyright (c) 2011 by Cisco Systems, Inc.
Time remaining: 102 seconds
Hold Time: 120 seconds
System Capabilities: R
Enabled Capabilities: R
Management Addresses:
IPv4 address: 10.5.173.110
""",
"""
RP/0/RSP0/CPU0:cust_rt_1#show run
interface GigabitEthernet0/0/0/8
description To router_XR_01 Gi0/0/19
bundle id 200 mode active
!
interface Bundle-Ether200
description To router_XR_01 Gi0/0/19 LAG 100
RP/0/RSP0/CPU0:cust_rt_1#show lldp neighbors detail
------------------------------------------------
Local Interface: GigabitEthernet0/0/0/8
Chassis id: 0026.9815.c3e6
Port id: GigabitEthernet0/0/0/19
Port Description: To cust_rt_1 Gi0/0/0/8
System Name: router_XR_01
System Description:
Cisco IOS XR Software, Version 4.1.0.32I[Default]
Copyright (c) 2011 by Cisco Systems, Inc.
Time remaining: 102 seconds
Hold Time: 120 seconds
System Capabilities: R
Enabled Capabilities: R
Management Addresses:
IPv4 address: 10.5.173.111
"""]
}
config = {
"add_lag": True
}
drawing = create_yed_diagram()
drawer = layer_2_drawer(drawing, config)
drawer.work(data)
drawer.drawing.dump_file(filename="test_cdp_drawing_yed_data_dict_cisco_iosxr_lldp_behind_lag.graphml", folder="./Output/")
    with open("./Output/test_cdp_drawing_yed_data_dict_cisco_iosxr_lldp_behind_lag.graphml") as produced:
with open("./Output/should_be_test_cdp_drawing_yed_data_dict_cisco_iosxr_lldp_behind_lag.graphml") as should_be:
assert produced.read() == should_be.read()
# test_cdp_drawing_yed_data_dict_cisco_iosxr_lldp_behind_lag() | 31.002227 | 186 | 0.707337 | 15,689 | 111,391 | 4.911467 | 0.031232 | 0.016624 | 0.041113 | 0.048588 | 0.974603 | 0.968322 | 0.962754 | 0.95694 | 0.951555 | 0.946623 | 0 | 0.087061 | 0.176828 | 111,391 | 3,593 | 187 | 31.002227 | 0.753299 | 0.007667 | 0 | 0.818833 | 0 | 0.012283 | 0.757184 | 0.113101 | 0 | 0 | 0 | 0 | 0.016377 | 1 | 0.015865 | false | 0 | 0.002047 | 0 | 0.017912 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
# foolbox/attacks/imagecorruption.py (ojwenzel/foolbox, MIT license)
] | null | null | null | from functools import wraps
from typing import List, Union
from warnings import warn
import re
import numpy as np
from .base import Attack
from .base import generator_decorator
from ..criteria import Misclassification
from ..distances import MSE
from .. import nprng
import imagecorruptions
# MODULE INFORMATION
__author__ = "Ole Jonas Wenzel"
__credits__ = ["Ole Jonas Wenzel", ""]
__version__ = "0.0.1"
__maintainer__ = "Ole Jonas Wenzel"
__email__ = "olejonaswenzel@gmail.com"
__status__ = "Dev"
# CONSTANTS
SEVERITIES = [1, 2, 3, 4, 5]
CORRUPTIONS = ['gaussian_noise', 'shot_noise', 'impulse_noise', 'defocus_blur', 'glass_blur', 'motion_blur',
               'zoom_blur', 'snow', 'frost', 'fog', 'brightness', 'contrast', 'elastic_transform', 'pixelate',
               'jpeg_compression',  # the first 15 entries form the 'common' subset
               'speckle_noise', 'gaussian_blur', 'spatter', 'saturate']  # the 'validation' subset
# FUNCTIONS
def validate_corruptions(corruptions: List[str]) -> List[str]:
"""
    Validates the elements of the input list for conformity with the corruptions provided by the package
imagecorruptions.
:param corruptions: list of corruptions
:return: elements of the input list, that are valid
"""
# PARAMETERS
all_corruptions = CORRUPTIONS
valid_corruptions = [corruption for corruption in corruptions if corruption in all_corruptions]
# check if there is any valid corruption provided
if len(valid_corruptions) == 0:
raise ValueError('\'corruptions\' has to be a list containing at least one valid element'
' from {}'.format(all_corruptions))
# remove all invalid corruptions from
if len(valid_corruptions) != len(corruptions):
warn('\'corruptions\' contains illegal values. Using: {}'.format(valid_corruptions))
return list(valid_corruptions)
def validate_severities(severities: List[int]) -> List[int]:
"""
    Validates the elements of the input list for conformity with the severities provided by the package
imagecorruptions.
:param severities: list of severities
:return: elements of the input list, that are valid
"""
# PARSE INPUT
if severities is None:
return SEVERITIES
# PARAMETERS
valid_severities = [severity for severity in severities if severity in SEVERITIES]
# check if there is any valid severity provided
if len(valid_severities) == 0:
raise ValueError('\'severities\' has to be a list containing at least one valid element'
' from {}'.format(SEVERITIES))
# remove all invalid severities from
if len(valid_severities) != len(severities):
warn('\'severities\' contains illegal values. Using: {}'.format(valid_severities))
return list(valid_severities)
def corruption_name_2_cls_name(corruption_name: str) -> str:
# convert corruption_name as in package imagecorruptions into class name as in this module
return ''.join([sub.capitalize() for sub in corruption_name.split('_')])
def cls_name_2_corruption_name(cls_name: str) -> str:
"""https://stackoverflow.com/questions/1175208/elegant-python-function-to-convert-camelcase-to-snake-case"""
s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', cls_name)
return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).lower()
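The two converters above are inverses of each other for the names used in this module. A standalone sketch of the round trip (reimplementing both one-liners so it runs on its own; `to_cls_name` and `to_corruption_name` are illustrative names):

```python
import re


def to_cls_name(corruption_name: str) -> str:
    # 'gaussian_noise' -> 'GaussianNoise'
    return ''.join(part.capitalize() for part in corruption_name.split('_'))


def to_corruption_name(cls_name: str) -> str:
    # 'GaussianNoise' -> 'gaussian_noise' (CamelCase -> snake_case)
    s1 = re.sub('(.)([A-Z][a-z]+)', r'\1_\2', cls_name)
    return re.sub('([a-z0-9])([A-Z])', r'\1_\2', s1).lower()


for name in ('gaussian_noise', 'jpeg_compression', 'elastic_transform'):
    assert to_corruption_name(to_cls_name(name)) == name
```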
def get_subset(subset: List[str] = ['common']) -> List[str]:
if 'common' in subset:
return CORRUPTIONS[:15]
elif 'validation' in subset:
return CORRUPTIONS[15:]
elif 'all' in subset or subset is None:
return CORRUPTIONS
else:
return validate_corruptions(subset)
def list_corruption_attack_names(subset: List[str] = ['common']) -> List[str]:
"""
Retrieves class names for specified subset of corruption attacks. Note: identifiers 'common', 'all', 'validation'
take precedence over individual corruption_names
:param subset: specification of subset to retrieve may contain CORRUPTIONS + ['common', 'all', 'validation']
:return:
"""
corruptions = get_subset(subset)
return [corruption_name_2_cls_name(corruption) for corruption in corruptions]
def get_corruption_attacks(subset: List[str] = ['common']):
"""
Retrieves classes for specified subset of corruption attacks. Note: identifiers 'common', 'all', 'validation' take
precedence over individual corruption_names
:param subset: specification of subset to retrieve may contain CORRUPTIONS + ['common', 'all', 'validation']
:return:
"""
corruptions = list_corruption_attack_names(subset)
return [globals()[corruption] for corruption in corruptions]
def image_transpose_decorator(func):
"""
    transposes images from [channels, width, height] to [width, height, channels] before calling function and back afterwards
:param func: image manipulation function
:return: function wrapper that pre- and post-processes func input
"""
@wraps(func)
def with_transpose(image: np.ndarray, *args, **kwargs):
image = image.transpose(1, 2, 0)
image = func(image, *args, **kwargs)
image = image.transpose(2, 0, 1)
return image
return with_transpose
def image_value_range_decorator(func):
"""
    converts the pixel values of an image from the [0, 1] float range to the
    [0, 255] uint8 range before calling function and converts the result back
:param func: image manipulation function
:return: func wrapper that pre- and post-processes func input
"""
@wraps(func)
def with_io_transform(image: np.ndarray, *args, **kwargs):
image = to_255_image(image)
image = func(image, *args, **kwargs)
image = to_0_1_image(image)
return image
return with_io_transform
@image_value_range_decorator
@image_transpose_decorator
def corrupt(image: np.ndarray, corruption: str, severity: int):
return imagecorruptions.corrupt(image, corruption_name=corruption, severity=severity)
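The two decorators compose so that the wrapped function receives an HWC uint8 image while callers keep CHW floats in [0, 1]. A standalone sketch with a no-op stand-in for the corruption function (dummy names, not the imagecorruptions API):

```python
import numpy as np
from functools import wraps


def transpose_chw_hwc(func):
    # call func on an HWC view, transpose the result back to CHW
    @wraps(func)
    def wrapper(image, *args, **kwargs):
        return func(image.transpose(1, 2, 0), *args, **kwargs).transpose(2, 0, 1)
    return wrapper


def as_uint8(func):
    # feed func a [0, 255] uint8 image, convert the result back to [0, 1] float32
    @wraps(func)
    def wrapper(image, *args, **kwargs):
        out = func((image * 255).astype('uint8'), *args, **kwargs)
        return (out / 255).astype('float32')
    return wrapper


@as_uint8            # outermost, like image_value_range_decorator
@transpose_chw_hwc   # innermost, like image_transpose_decorator
def identity_corruption(image):
    # stand-in for imagecorruptions.corrupt: sees an HWC uint8 array
    assert image.dtype == np.uint8 and image.shape[-1] == 3
    return image


x = np.zeros((3, 32, 32), dtype='float32')
y = identity_corruption(x)
assert y.shape == (3, 32, 32) and y.dtype == np.float32
```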
def to_255_image(image: np.ndarray) -> np.ndarray:
return (image * 255).astype('uint8')
def to_0_1_image(image: np.ndarray) -> np.ndarray:
return (image / 255).astype('float32')
def attack_decorator(corruption: str):
def outer_dec(func):
@wraps(func)
def inner_dec(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
x = a.unperturbed
severities = validate_severities(severities)
seed = x.sum().astype(int)
for _ in range(repetitions):
for s, severity in enumerate(severities):
if not random:
np.random.seed(seed + severity)
perturbed = corrupt(x, corruption=corruption, severity=severity)
if a.normalized_distance(perturbed) >= a.distance:
continue
_, is_adversarial = yield from a.forward_one(perturbed)
if is_adversarial:
break
return inner_dec
return outer_dec
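The generated attack body walks the severities in order and stops at the first adversarial perturbation. A minimal non-generator sketch of that escalation loop, with a dummy predicate standing in for the model query:

```python
SEVERITIES = [1, 2, 3, 4, 5]


def first_adversarial_severity(is_adversarial, severities=None, repetitions=10):
    """Return the first severity whose corruption fools the dummy model, else None."""
    if severities is None:
        severities = SEVERITIES
    for _ in range(repetitions):
        for severity in severities:
            if is_adversarial(severity):
                return severity
    return None


# with a model that breaks at severity 3, the loop stops there
assert first_adversarial_severity(lambda s: s >= 3) == 3
```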
# CLASSES
class GaussianNoise(Attack):
"""Increases the amount of image corruption until the input is misclassified.
"""
CORRUPTION = 'gaussian_noise'
@generator_decorator
@attack_decorator('gaussian_noise')
def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
"""Increases the amount of specified image corruption until the input is misclassified.
Parameters
----------
input_or_adv : `numpy.ndarray` or :class:`Adversarial`
The original, unperturbed input as a `numpy.ndarray` or
an :class:`Adversarial` instance.
label : int
The reference label of the original input. Must be passed
if `a` is a `numpy.ndarray`, must not be passed if `a` is
an :class:`Adversarial` instance.
unpack : bool
If true, returns the adversarial input, otherwise returns
the Adversarial object.
severities : List[int]
severity-strengths of attack
repetitions : int
Specifies how often the attack will be repeated.
random : bool
if False an image and corruption severity specific seed is set for reproducibility
"""
def __str__(self):
return 'Corruption attack of type \'{}\''.format(self.CORRUPTION)
class ShotNoise(Attack):
"""Increases the amount of image corruption until the input is misclassified.
"""
CORRUPTION = 'shot_noise'
@generator_decorator
    @attack_decorator('shot_noise')
def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
"""Increases the amount of specified image corruption until the input is misclassified.
Parameters
----------
input_or_adv : `numpy.ndarray` or :class:`Adversarial`
The original, unperturbed input as a `numpy.ndarray` or
an :class:`Adversarial` instance.
label : int
The reference label of the original input. Must be passed
if `a` is a `numpy.ndarray`, must not be passed if `a` is
an :class:`Adversarial` instance.
unpack : bool
If true, returns the adversarial input, otherwise returns
the Adversarial object.
severities : List[int]
severity-strengths of attack
repetitions : int
Specifies how often the attack will be repeated.
random : bool
if False an image and corruption severity specific seed is set for reproducibility
"""
def __str__(self):
return 'Corruption attack of type \'{}\''.format(self.CORRUPTION)
class ImpulseNoise(Attack):
"""Increases the amount of image corruption until the input is misclassified.
"""
CORRUPTION = 'impulse_noise'
@generator_decorator
@attack_decorator('impulse_noise')
def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
"""Increases the amount of specified image corruption until the input is misclassified.
Parameters
----------
input_or_adv : `numpy.ndarray` or :class:`Adversarial`
The original, unperturbed input as a `numpy.ndarray` or
an :class:`Adversarial` instance.
label : int
The reference label of the original input. Must be passed
if `a` is a `numpy.ndarray`, must not be passed if `a` is
an :class:`Adversarial` instance.
unpack : bool
If true, returns the adversarial input, otherwise returns
the Adversarial object.
severities : List[int]
severity-strengths of attack
repetitions : int
Specifies how often the attack will be repeated.
random : bool
if False an image and corruption severity specific seed is set for reproducibility
"""
def __str__(self):
return 'Corruption attack of type \'{}\''.format(self.CORRUPTION)
class DefocusBlur(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'defocus_blur'

    @generator_decorator
    @attack_decorator('defocus_blur')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class GlassBlur(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'glass_blur'

    @generator_decorator
    @attack_decorator('glass_blur')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class MotionBlur(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'motion_blur'

    @generator_decorator
    @attack_decorator('motion_blur')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class ZoomBlur(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'zoom_blur'

    @generator_decorator
    @attack_decorator('zoom_blur')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Snow(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'snow'

    @generator_decorator
    @attack_decorator('snow')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Frost(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'frost'

    @generator_decorator
    @attack_decorator('frost')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Fog(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'fog'

    @generator_decorator
    @attack_decorator('fog')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Brightness(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'brightness'

    @generator_decorator
    @attack_decorator('brightness')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Contrast(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'contrast'

    @generator_decorator
    @attack_decorator('contrast')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class ElasticTransform(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'elastic_transform'

    @generator_decorator
    @attack_decorator('elastic_transform')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class Pixelate(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'pixelate'

    @generator_decorator
    @attack_decorator('pixelate')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
class JpegCompression(Attack):
    """Increases the amount of image corruption until the input is misclassified."""

    CORRUPTION = 'jpeg_compression'

    @generator_decorator
    @attack_decorator('jpeg_compression')
    def as_generator(self, a, severities: List[int] = None, repetitions=10, random: bool = True):
        """Increases the amount of the specified image corruption until the input is misclassified.

        Parameters
        ----------
        input_or_adv : `numpy.ndarray` or :class:`Adversarial`
            The original, unperturbed input as a `numpy.ndarray` or
            an :class:`Adversarial` instance.
        label : int
            The reference label of the original input. Must be passed
            if `a` is a `numpy.ndarray`, must not be passed if `a` is
            an :class:`Adversarial` instance.
        unpack : bool
            If True, returns the adversarial input, otherwise returns
            the Adversarial object.
        severities : List[int]
            Severity levels of the corruption to apply.
        repetitions : int
            How often the attack is repeated.
        random : bool
            If False, an image- and severity-specific seed is set for
            reproducibility.
        """
        attack(a=a, corruption=self.CORRUPTION, severities=severities, repetitions=repetitions, random=random)

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)
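# ---------------------------------------------------------------------------
# Design note (illustrative sketch, not the library's actual structure): the
# classes above differ only in their CORRUPTION attribute and decorator
# argument, so the whole family could also be generated programmatically.
# `_CorruptionAttackSketch`, `_make_attack_class`, and `_SKETCH_ATTACKS` are
# hypothetical names introduced here for illustration only.
# ---------------------------------------------------------------------------
_SKETCH_CORRUPTIONS = [
    'impulse_noise', 'defocus_blur', 'glass_blur', 'motion_blur', 'zoom_blur',
    'snow', 'frost', 'fog', 'brightness', 'contrast', 'elastic_transform',
    'pixelate', 'jpeg_compression',
]


class _CorruptionAttackSketch:
    CORRUPTION = None  # overridden on each generated subclass

    def __str__(self):
        return "Corruption attack of type '{}'".format(self.CORRUPTION)


def _make_attack_class(corruption: str) -> type:
    # CamelCase the snake_case name, e.g. 'glass_blur' -> 'GlassBlur'.
    name = ''.join(part.capitalize() for part in corruption.split('_'))
    return type(name, (_CorruptionAttackSketch,), {'CORRUPTION': corruption})


_SKETCH_ATTACKS = {cls.__name__: cls for cls in map(_make_attack_class, _SKETCH_CORRUPTIONS)}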